Better batteries charge forward

Everybody wants more juice from their batteries. Smartphones and laptops always need recharging. Electric car drivers must carefully plan their routes to avoid being stranded far from a charging station. Anyone who struggles with a tangle of chargers every night would prefer a battery that can last for weeks or months.

For researchers who specialize in batteries, though, the drive for a better battery is less about the luxury of an always-charged iPad (though that would be nice) and more about kicking our fossil fuel habit. Given the right battery, smog-belching cars and trucks could be replaced with vehicles that run whisper-quiet on electricity alone. No gasoline engine, no emissions. Even airplanes could go electric. And the power grid could be modernized to use cheaper, greener fuels such as sunlight or wind even on days when the sun doesn’t shine bright enough or the wind doesn’t blow hard enough to meet electricity demand.

A better battery has the potential to jolt people into the future, just as the lithium-ion battery did. When lithium-ion batteries became popular in the early 1990s, they offered twice as much energy as the next best alternative. They changed the way people communicate.

“What the lithium-ion battery did to personal electronics was transformational,” says materials scientist George Crabtree, director of the Joint Center for Energy Storage Research at Argonne National Laboratory in Illinois. “The cell phone not only made landlines obsolete for many, but [the lithium-ion battery] put cameras and the internet into the hands of millions.” That huge leap didn’t happen overnight. “It was the sum of many incremental steps forward, and decades of work,” says Crabtree, who coordinates battery research in dozens of U.S. labs.
Lithium-ion batteries have their limits, however, especially for use in the power grid and in electric vehicles. Fortunately, like their Energizer mascot, battery researchers never rest. Over the last 10 years, universities, tech companies and car manufacturers have explored hundreds of new battery technologies, reaching for an elusive and technically difficult goal: next-generation batteries that hold more energy, last longer and are cheaper, safer and easier to recharge.

A decade of incremental steps is beginning to pay off. In late 2017, scientists will introduce a handful of prototype batteries to be developed by manufacturers for potential commercialization. Some contain new ingredients — sulfur and magnesium — that help store energy more efficiently, delivering power for longer periods. Others employ new designs.
“These prototypes are proof-of-principle batteries, miniature working versions,” Crabtree says. Getting the batteries into consumer hands will take five to 10 years. Making leaps in battery technology, he says, is surprisingly hard to do.

Power struggle
Batteries operate like small chemical plants. Technically, a battery is a combination of two or more “electrochemical cells” in which energy released by chemical reactions produces a flow of electrons. The greater the energy produced by the chemical reactions, the greater the electron flow. Those electrons provide a current to whatever the battery is powering — kitchen clock, smoke alarm, car engine.

To power any such device, the electrons must flow through a circuit connecting two electrodes, known as an anode and a cathode, separated by a substance called an electrolyte. At the anode, chemical oxidation reactions release electrons. At the cathode, electrons are taken up in reduction reactions. The electrolyte enables ions created by the oxidation and reduction reactions to pass back and forth between the two electrodes, completing the circuit.
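
For readers who like to see the arithmetic, here is a minimal back-of-the-envelope sketch of how electrode choice sets a cell's stored energy. The voltage, electron count and molar mass in the example are illustrative assumptions, not values from any battery described in this article.

```python
# Back-of-the-envelope sketch: a cell's stored energy is roughly its voltage
# times the charge its electrodes can exchange. All numbers below are
# illustrative assumptions, not measured values for any battery in this article.

FARADAY = 96485.0  # coulombs carried by one mole of electrons

def specific_energy_wh_per_kg(cell_voltage_v, electrons_per_unit, molar_mass_g):
    """Theoretical energy per mass of active material, in watt-hours per kilogram."""
    charge_c_per_g = electrons_per_unit * FARADAY / molar_mass_g  # coulombs per gram
    energy_j_per_g = cell_voltage_v * charge_c_per_g              # joules per gram
    return energy_j_per_g * 1000.0 / 3600.0                       # convert J/g to Wh/kg

# A hypothetical 3.7-volt cell whose active material weighs ~100 g/mol and
# exchanges one electron per formula unit:
print(round(specific_energy_wh_per_kg(3.7, 1, 100)))  # roughly 990 Wh/kg, theoretical
```

The point of the sketch is simply that energy scales with both the cell's voltage and the amount of charge its chemistry can move, which is why researchers chase higher-voltage couples and lighter, charge-rich electrode materials.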

Depending on the materials used for the electrodes and the electrolyte, a battery may be recharged by supplying current that drives the chemical reactions in reverse. In creating new recipes for a rechargeable electrochemical soup, though, battery researchers must beware of side reactions that can spoil everything.
“There’s the chemical reaction you want — the one that stores energy and releases it,” Crabtree says. “But there are dozens of other … reactions that also take place.” Those side reactions can disable a battery, or worse, lead to a risk of catastrophic discharge. (Consider the recent fires in Samsung’s Galaxy Note 7 smartphones.)

Early versions of the lithium-ion battery from the 1970s carried an anode made of pure lithium metal. Through repeated use, lithium ions were stripped off and replated onto the anode, creating fingerlike extensions that reached across to the cathode, shorting out the battery. Today’s lithium-ion batteries have an anode made of graphite (a form of carbon) so that loose lithium ions can snuggle in between sheets of carbon atoms.

Lithium-ion batteries were originally developed with small electronics in mind; they weren’t designed for storing electricity on the grid or powering electric vehicles. Electric cars need lots of power, a quick burst of energy to move from stop to start. Electric car manufacturers now bundle thousands of such batteries together to provide power for up to 200 miles before recharging, but that range still falls far short of what a tank of gas can offer. And lithium-ion batteries drain too quickly to feed long hours of demand on the grid.

Simply popping more batteries into a car or the grid isn’t the answer, Crabtree says. Stockpiling doesn’t improve the charging time or the lifetime of the battery. It’s also bulky. Carmakers have to leave drivers room for their passengers plus some trunk space. To make electric vehicles competitive with, or better than, vehicles run by internal-combustion engines, manufacturers will need low-cost, high-energy batteries that last up to 15 years. Likewise, grid batteries need to store energy for later use at low cost, and stand up to decades of use.

“There’s no one battery that’s going to meet all our needs,” says MIT materials scientist Yet-Ming Chiang. Batteries needed for portable devices are very different from those needed for transportation or grid-scale storage. Expect to see a variety of new battery types, each designed for a specific application.
Switching to sulfur
For electric vehicles, lithium-sulfur batteries are the next great hope. The cathode is made mainly of sulfur, an industrial waste product that is cheap, abundant and environmentally friendly. The anode is made of lithium metal.

During discharge, lithium ions break away from the anode and swim through the liquid electrolyte to reach the sulfur cathode. There, the ions form a covalent bond with the sulfur atoms. Each sulfur atom bonds to two lithium ions, rather than just one, doubling the number of bonds in the cathode of the battery. More chemical bonds means more stored energy, so a lithium-sulfur battery creates more juice than a lithium-ion one. That, combined with sulfur’s light weight, means that, in principle, manufacturers can pack more punch for a given amount of weight, storing four or five times as much energy per gram.

Ultimately, that upgrade could boost an electric vehicle’s range to as much as 500 miles on a single charge. But first, researchers have to get past the short lifetime of existing lithium-sulfur batteries, which, Crabtree says, is due to a loss of lithium and sulfur in each charge-discharge cycle.

When lithium combines with sulfur, it also forms compounds called polysulfides, which quickly gum up the battery’s insides. Polysulfides form within the cathode during battery discharge, when stored energy is released. Once they dissolve in the battery’s liquid electrolyte, the polysulfides shuttle to the anode and react with it, forming a film that renders the battery useless within a few dozen cycles — or one to two months of use.

At Sandia National Laboratories in Albuquerque, N.M., a team led by Kevin Zavadil is trying to block the formation of polysulfides in the electrolyte. The electrolyte consists of salt and a solvent, and current lithium-sulfur batteries require a large amount of electrolyte to achieve a moderate life span. Zavadil and his team are developing “lean” electrolyte mixtures less likely to dissolve the sulfur molecules that create polysulfides.

Described September 9 in ACS Energy Letters, the new electrolyte mix contains a higher-than-usual salt concentration and a “sparing” amount of solvent. The researchers also reduced the overall amount of electrolyte in the batteries. In test runs, the tweaks dropped the concentration of polysulfides by several orders of magnitude, Zavadil says.
“We [also] have some ideas on how to use membranes to protect the lithium surface to prevent polysulfides from happening in the first place,” Zavadil says. The goal is to produce a working prototype of the new battery — one that can last through thousands of cycles — by the end of 2017.

At the University of Texas at Austin, materials engineer Guihua Yu, along with colleagues at Zhejiang University of Technology in Hangzhou, China, is investigating another work-around for this battery type: replacing the solid sulfur cathode with an intricate structure that encapsulates the sulfur in an array of nanotubes. As reported in the November issue of Nano Letters, the nanotubes that encase the sulfur are fashioned from manganese dioxide, a material that can attract and hold on to polysulfides. The nanotubes are coated with polypyrrole, a conductive polymer that helps boost the flow of electrons.

This approach reduces buildup and boosts overall conductivity and efficiency, Yu says. So far, with the group’s new architecture, the battery loses less than 0.07 percent of its capacity per charge and discharge cycle. After 500 cycles, the battery maintained about 65 percent of its original capacity, a great improvement over the short lifetime of current lithium-sulfur batteries. Still, for use in electric vehicles, scientists want batteries that can last through thousands of cycles, or 10 to 15 years.
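
As a rough consistency check on those numbers, a fixed per-cycle loss compounds geometrically. The sketch below, using only the 0.07 percent figure quoted above, implies roughly 70 percent retention at 500 cycles, in the same ballpark as the reported 65 percent (real fade is rarely perfectly geometric).

```python
# Compounding a fixed per-cycle capacity loss. The 0.07 percent per cycle is
# the figure quoted above; the rest is arithmetic.

def capacity_remaining(cycles, loss_per_cycle=0.0007):
    return (1 - loss_per_cycle) ** cycles

print(f"{capacity_remaining(500):.0%}")    # about 70% after 500 cycles
print(f"{capacity_remaining(2000):.0%}")   # about 25% after 2,000 cycles
```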
Scientists at Argonne are constructing another battery type: one that replaces lithium ions at the anode with magnesium. This switch could instantly boost the electrical energy released for the same volume, says Argonne materials engineer Brian Ingram, noting that a magnesium ion has a charge of +2, double that of lithium’s +1. Magnesium’s ability to produce twice the electrical current of lithium ions could allow for smaller, more energy-dense batteries, Ingram says.

Magnesium comes with its own challenge, however. Whereas lithium ions zip through a battery’s electrolyte, magnesium ions slowly trudge. A team of researchers at Argonne, Northwestern University and Oak Ridge National Laboratory shot high-energy X-rays at magnesium in various batteries and learned that the drag is due to interactions with molecules that the magnesium attracts within the electrolyte. Ingram and his group are experimenting with new materials to find a molecular recipe that reduces such drag.

Ingram’s team is trying to nudge its “highly functioning, long-lasting” prototype to 3 volts by December. Today’s typical lithium-ion battery has 3.8 to 4 volts. At 3 volts, Ingram says, the magnesium battery would pack more power than a 4-volt lithium-ion battery and “create a tremendous amount of excitement within the field.”
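
The arithmetic behind that claim is straightforward: the energy delivered per ion is charge times voltage, and each magnesium ion carries twice the charge of a lithium ion. A minimal sketch, using only the voltages quoted above:

```python
# Energy per ion = (charge the ion carries) x (cell voltage), in electron-volts.
# The voltages are the ones quoted above.

lithium_ev = 1 * 4.0     # Li+ moves one electron's worth of charge at ~4 volts
magnesium_ev = 2 * 3.0   # Mg2+ moves two electrons' worth of charge at 3 volts

print(lithium_ev, magnesium_ev)      # 4.0 vs 6.0 eV per ion
print(magnesium_ev / lithium_ev)     # 1.5x the energy per ion
```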

Going with the flow
Together, transportation and the electricity grid account for about two-thirds of U.S. energy use. But today, only 10 percent of the electricity on the grid is from renewable sources, according to the U.S. Energy Information Administration. If wind and solar power are ever to wrest energy production away from fossil fuels, big changes must happen in energy storage.
What is needed, Crabtree says, is a battery that can store energy, and lots of it, for later use. “Though the sun shines in the middle of the afternoon, peak demand comes at sunset when people go home, turn on lights and cook dinner,” he says.

To reliably supply electricity at night or on cloudy, windless days requires a different type of battery. By design, flow batteries fit the bill. Instead of having solid electrodes, flow batteries store energy in two separate tanks filled with chemicals — one positively charged, the other negatively charged. Pumps move the liquids from the tanks into a central chamber, or “stack,” where dissolved molecules in the liquids undergo chemical reactions that store and give up energy. A membrane located in the stack keeps the positive and negative ions separated.
Flow batteries can store energy for a long time and provide power as needed. Because the energy-storing liquids are kept in external tanks, the batteries are unlikely to catch fire, and can be built large or small depending on need. To store more energy, use a larger tank.
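
That separation of duties is what sets flow batteries apart: the tanks determine how much energy is stored, while the stack determines how fast it can be delivered. A rough sketch of that scaling, with made-up numbers for the tank size, the liquid's energy density and the stack rating:

```python
# Flow-battery scaling sketch: stored energy grows with tank volume, while the
# stack sets how fast that energy can be delivered. All figures are invented
# for illustration.

def hours_of_supply(tank_liters, energy_density_wh_per_liter, stack_power_watts):
    stored_energy_wh = tank_liters * energy_density_wh_per_liter
    return stored_energy_wh / stack_power_watts

# Same stack, ten times the tank: ten times the hours of supply.
print(hours_of_supply(1_000, 20, 5_000))    # 4.0 hours
print(hours_of_supply(10_000, 20, 5_000))   # 40.0 hours
```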

So far, however, flow batteries are expensive to make and maintain, and have had limited use for providing backup power on the grid. Today’s flow batteries are packed with rare and toxic metal components, usually vanadium. With many moving parts — tanks, pumps, seals and sensors — breakdowns and leakage are common.

At MIT, Chiang and colleagues are developing flow batteries that can bypass those drawbacks. One, an hourglass flow battery, does away with the need for costly and troublesome pumps. The stack where the chemical reactions occur is in the constricted middle, with tanks at either end. Gravity allows the liquids to flow through the stack, like sand in an hourglass. A motor adjusts the battery’s angle to speed or slow the flow.

The hourglass design is like a “concept car,” Chiang says. Though the final product is likely to take a slightly different shape, the design could serve as a model for future flow batteries. Simply changing the tilt of the device could add a short infusion of power to the grid during periods of peak demand, or slowly release energy over a period of many hours to keep air conditioners and heaters running when the sun is down.

In another design, the group has replaced vanadium with sulfur, which is inexpensive and abundant. Dissolved in water (also cheap and plentiful), sulfur is circulated into and out of the battery’s stack, creating a reaction that stores or gives up energy, similar to commercial flow batteries. The group is now refining the battery, first described in 2014 in Nano Letters, aiming for higher levels of energy.

Another challenge in developing flow batteries is finding ways to keep active materials confined to their tanks. That’s the job of the battery membrane, but because the organic molecules under consideration for battery use are almost always small, they too easily slip through the membrane, reducing the battery’s lifetime and performance.
Rather than change the membrane, a group led by chemist Joaquín Rodríguez-López of the University of Illinois at Urbana-Champaign devised ways to bulk up the battery’s active materials by changing their size or configuration. The scientists linked tens to millions of active molecules together to create large, ringed structures, long strings of molecules hooked onto a polymer backbone, or suspensions of polymers containing up to a billion molecules, they reported in the Journal of the American Chemical Society in 2014.

With the oversized molecules, even “simple, inexpensive porous membranes are effective at preventing crossover,” Crabtree says. A prototype flow battery that provides low-cost power and lasts 20 to 30 years is expected to be completed in the coming year.

Getting air
Looking beyond 2017, scientists envision a new generation of batteries made of low-cost, or even no-cost, materials. The lithium-air battery, still in early development, uses oxygen sucked in from the atmosphere to drive the chemical reaction that produces electricity. In the process, oxygen combines with lithium ions to form a solid compound (lithium peroxide). During charging, the reaction reverses and the oxygen returns to its gaseous form.

“Lithium-air potentially offers the highest energy density possible,” says MIT materials engineer Ju Li. “You basically react lithium metal with oxygen in air, and in principle you get as much useful energy as gasoline.”

Lithium-air has problems, though. The batteries are hard to recharge, losing much of their power during the process. And the chemical reaction that powers the battery generates heat, cutting the battery’s energy-storage capacity and life span.

Using electron microscopy to study the reaction products of a lithium-air prototype, Li and his group came up with a possible solution: Keep oxygen in a solid form sealed within the battery to prevent the oxygen from forming a gas. By encasing oxygen and lithium in tiny glasslike particles, the scientists created a fully sealed battery. The new strategy, published July 25 online in Nature Energy, curbed energy loss during recharging and prevented heat buildup.

“If it works on a large scale, we would have an electrical vehicle that’s competitive with gasoline-driven cars,” Li says. Reaching that goal would be a big step toward a greener planet.

Chemists strike gold, solve mystery about precious metal’s properties

Gold’s glimmer is not the only reason the element is so captivating. For decades, scientists have puzzled over why theoretical predictions of gold’s properties don’t match up with experiments. Now, highly detailed calculations have erased the discrepancy, according to a paper published in the Jan. 13 Physical Review Letters.

At issue was the energy required to remove an electron from a gold atom, or ionize it. Theoretical calculations of this ionization energy differed from scientists’ measurements. Likewise, the energy released when adding an electron — a quantity known as the electron affinity — was also off the mark. How easily an atom gives up or accepts electrons is important for understanding how elements react with other substances.
“It was well known that gold is a difficult system,” says chemist Sourav Pal of the Indian Institute of Technology Bombay, who was not involved with the study. Even gold’s most obvious feature can’t be explained without calling Einstein’s special theory of relativity into play: The theory accounts for gold’s yellowish color. (Special relativity shifts around the energy levels of electrons in gold atoms, causing the metal to absorb blue light, and thereby making reflected light appear more yellow.)

With this new study, scientists have finally resolved the lingering questions about the energy involved in removing or adding an electron to the atom. “That is the main significance of this paper,” Pal says.

Early calculations, performed in the 1990s, differed from the measured energies by more than a percent, and improved calculations since then still didn’t match the measurements. “Every time I went to a conference, people discussed that and asked, ‘What the hell is going on?’” says study coauthor Peter Schwerdtfeger, a chemist at Massey University Auckland in New Zealand.

The solution required a more complete consideration of the complex interplay among gold’s 79 electrons. Using advanced supercomputers to calculate the interactions of up to five of gold’s electrons at a time, the scientists resolved the discrepancy. Previous calculations had considered up to three electrons at a time. Also essential to include in the calculation were the effects of special relativity and the theory of quantum electrodynamics, which describes the quantum physics of particles like electrons.

The result indicates that gold indeed adheres to expectations — when calculations are detailed enough. “Quantum theory works perfectly well, and that makes me extremely happy,” says Schwerdtfeger.

Spin may reveal black hole history

WASHINGTON — Researchers have devised a test to see if pairs of black holes — famous for creating gravitational waves when they merge — themselves formed from multiple mergers of smaller black holes.

The Advanced Laser Interferometer Gravitational-Wave Observatory, LIGO, has detected spacetime ripples from two sets of merging black holes (SN: 7/9/16, p. 8). Scientists typically assume that such black holes formed in the collapse of a massive star. But in especially crowded patches of the universe, black holes could have formed over generations of unions, astrophysicist Maya Fishbach of the University of Chicago explained January 28 at a meeting of the American Physical Society. Or the merging cycle could have occurred in the very early universe, starting with primordial black holes — objects that may have formed when extremely dense pockets of matter directly collapsed.
Fishbach and colleagues studied how quickly black holes whirl around. In simulations, black holes that repeatedly merged reached a high rate of spin, the scientists found. That result didn’t depend on certain properties of the initial black holes, like whether they were spinning to begin with or not. “It’s cool,” says Fishbach. “The predictions from this in terms of spin are very robust,” making the idea easy to test.

So far, the spins of LIGO’s black holes are lower than the predictions. If the multiple merging process occurs, it could be very rare, so to conclusively test the idea would require tens to hundreds of black hole detections, Fishbach says.

Long-lasting mental health isn’t normal

Abnormal is the new normal in mental health.

A small, poorly understood segment of the population stays mentally healthy from age 11 to 38, a new study of New Zealanders finds. Everyone else encounters either temporary or long-lasting mental disorders.

Only 171 of 988 participants, or 17 percent, experienced no anxiety disorders, depression or other mental ailments from late childhood to middle age, researchers report in the February Journal of Abnormal Psychology. Of the rest, half experienced a transient mental disorder, typically just a single bout of depression, anxiety or substance abuse by middle age.
“For many, an episode of mental disorder is like influenza, bronchitis, kidney stones, a broken bone or other highly prevalent conditions,” says study coauthor Jonathan Schaefer, a psychologist at Duke University. “Sufferers experience impaired functioning, many seek medical care, but most recover.”

The remaining 408 individuals (41 percent) experienced one or more mental disorders that lasted several years or more. Their diagnoses included more severe conditions such as bipolar and psychotic disorders.
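
Laid out side by side, the three groups reported above break down as follows (a quick tally using only the counts in this article):

```python
# Tallying the Dunedin study's groups, using only the counts reported above.
total = 988
enduring_health = 171                     # no diagnosed disorder from age 11 to 38
lasting_disorder = 408                    # one or more disorders lasting years
transient_disorder = total - enduring_health - lasting_disorder

for label, n in [("enduring mental health", enduring_health),
                 ("transient disorder", transient_disorder),
                 ("lasting disorder", lasting_disorder)]:
    print(f"{label}: {n} of {total} ({n / total:.0%})")
```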

Researchers analyzed data for individuals born between April 1972 and March 1973 in Dunedin, New Zealand. Each participant’s general health and behavior were assessed 13 times from birth to age 38. Eight mental health assessments occurred from age 11 to 38.
Surprisingly, those who experienced lasting mental health did not display several characteristics previously linked to a lower likelihood of developing mental disorders: growing up in unusually affluent families, enjoying especially sound physical health and scoring exceptionally high on intelligence tests.
Instead, mentally healthy participants tended to possess advantageous personality traits starting in childhood, Schaefer and colleagues found. These participants rarely expressed strongly negative emotions, had lots of friends and displayed superior self-control. Kiwis with rock-solid mental health also had fewer first- and second-degree relatives with mental disorders compared with their peers.

As adults, participants with enduring mental health reported, on average, more education, better jobs, higher-quality relationships and more satisfaction with their lives than their peers did. But lasting mental health doesn’t guarantee an exceptional sense of well-being, Schaefer says. Nearly one-quarter of never-diagnosed individuals scored below the entire sample’s average score for life satisfaction.

Less surprising was the 83 percent overall prevalence rate for mental disorders. That coincides with recent estimates from four other long-term projects. In those investigations — two in the United States, one in Switzerland and another in New Zealand — between 61 percent and 85 percent of participants developed mental disorders over 12- to 30-year spans.

Comparably high rates of emotional disorders were reported in 1962 for randomly selected Manhattan residents. Many researchers doubted those findings, which relied on a diagnostic system that was less strict than the three versions of psychiatry’s diagnostic manual that were introduced and used to evaluate New Zealand participants as they got older, says psychiatric epidemiologist William Eaton of Johns Hopkins Bloomberg School of Public Health. But the Manhattan study appears to have been on the right track, Eaton says.

Increased awareness that most people will eventually develop a mental disorder (SN: 10/10/09, p. 5), at least briefly, can reduce stigma attached to these conditions (SN Online: 10/13/16), he suspects.

Psychiatric epidemiologist Ronald Kessler suspects the numbers of people experiencing a mental disorder may be even higher than reported. Many participants deemed to have enduring mental health likely developed brief mental disorders that got overlooked, such as a couple of weeks of serious depression after a romantic breakup, says Kessler of Harvard Medical School, who directs U.S. surveys of mental disorders. Rather than focusing on rare cases of lasting mental health, “the more interesting thing is to compare people with persistent mental illness to those with temporary disorders,” he says.

How hydras know where to regrow their heads

Hydras, petite pond polyps known for their seemingly eternal youth, exemplify the art of bouncing back (SN: 7/23/16, p. 26). The animals’ cellular scaffolding, or cytoskeleton, can regrow from a slice of tissue that’s just 2 percent of the original hydra’s full body size. Researchers thought that molecular signals told cells where and how to rebuild, but new evidence suggests there are other forces at play.

Physicist Anton Livshits and colleagues at the Technion-Israel Institute of Technology in Haifa genetically engineered Hydra vulgaris specimens so that stretchy protein fibers called actins, which form the cytoskeleton, lit up under a microscope. Then, the team sliced and diced to look for mechanical patterns in the regeneration process.
Actin fibers in pieces of hydra exert mechanical force that lines up new cells and guides the growth of the animal’s head and tentacles, the researchers found. Turning off motor proteins that move actin stopped regeneration, and physically manipulating actin fiber alignment resulted in hydras with multiple heads. Providing hydras with further structural stability encouraged tissue slices to grow normally. Both mechanical and molecular forces may mold hydras in regeneration, the researchers report in the Feb. 7 Cell Reports.
When researchers anchored rings of hydra tissue to a wire, they found that the added mechanical stability made a hydra grow normally along one body axis, and thus grow one head. Without this stability, the actin scaffolding was more disrupted and the animal grew two heads.

Seagrasses boost ecosystem health by fighting bad bacteria

BOSTON — For a lawn that helps the environment — and doesn’t need to be mowed — look to the ocean. Meadows of underwater seagrass plants might lower levels of harmful bacteria in nearby ocean waters, researchers reported February 16 during a news conference at the annual meeting of the American Association for the Advancement of Science. That could make the whole ecosystem — from corals to fish to humans — healthier.

Not truly grasses, seagrasses are flowering plants with long, narrow leaves. They grow in shallow ocean water, spreading into vast underwater lawns. Seagrasses are “a marine powerhouse, almost equal to the rainforest. They’re one of the largest stores of carbon in the ocean,” says study coauthor Joleah Lamb, an ecologist at Cornell University. “But they don’t get a lot of attention.”
It’s no secret that seagrasses improve water quality, says James Fourqurean, a biologist at Florida International University in Miami who wasn’t involved in the research, which appears in the Feb. 17 Science. The plants are great at removing excess nitrogen and phosphorus from coastal waters. But now, it seems, they might take away harmful bacteria, too.

A few years ago, Lamb’s colleagues became ill with amoebic dysentery while studying coral reefs in Indonesia, an archipelagic nation that straddles the Indian and Pacific oceans. When a city or village on one of the country’s thousands of islands dumps raw sewage into the ocean, shoreline bacteria populations can spike to dangerous levels.
Water sampled close to the shores of four small and densely populated Indonesian islands had 10 times the U.S. Environmental Protection Agency’s recommended exposure limit of Enterococcus bacteria, which can cause illness in humans and often signals the presence of other pathogens. But water collected from offshore tidal flats and coral reefs with seagrass beds had lower levels of the bacteria compared with similar sites without the plants less than 20 meters away. The water had lower levels of numerous bacterial species that can make fish and marine invertebrates sick, too. And field surveys of more than 8,000 coral heads showed that those growing adjacent to or within seagrass beds had fewer diseases than those growing farther away.
It’s unclear how far from seagrass beds this cleaner water extends, but the benefits can ripple through the entire ecosystem, Lamb said at the news conference. Healthier corals help protect the islands from erosion. And fish less contaminated with bacteria make a better source of food for people.

Lamb is planning follow-up studies to figure out exactly how the seagrasses clean the water. Like a shag carpet, seagrasses trap small particulates drifting through the ocean and prevent them from flowing on. The plants might ensnare bacteria in the same way, building up biofilms on their blades. Or, she suggests, the leaves could be giving off antimicrobial compounds that directly kill the bacteria.

The findings are one more reason to conserve seagrasses, study coauthor Jeroen van de Water, an ecologist at the Scientific Center of Monaco, said at the news conference. Worldwide, seagrass beds are declining by 7 percent each year, thanks to pollution and habitat loss. And while restoration efforts are underway in some areas, “it’s better to stop what we’re doing to the meadows than to try to replant them,” Lamb added. “Seagrasses are quite particular in the depth they want to be at and the environment they want to have. It’s hard to start doing restoration projects if the environment isn’t exactly what the seagrass prefers.”

Hydrogen volcanoes might boost planets’ potential for life

Volcanoes that belch hydrogen could bump up the number of potentially habitable planets in the universe.

Ramses Ramirez and Lisa Kaltenegger, both of Cornell University, modeled the atmospheres of planets blemished with hydrogen-spewing volcanoes. These gaseous eruptions could warm planets and ultimately widen a star’s habitable zone, the region where liquid water can exist on a planet’s surface, by about 30 to 60 percent, researchers report in the March 1 Astrophysical Journal Letters. That would be like extending the outer edge of the sun’s habitable zone from about 254 million kilometers — just beyond Mars’ orbit — to 359 million kilometers, or roughly to the asteroid belt between Mars and Jupiter.
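
The solar-system example above is enough to check the size of the effect: moving the outer edge from 254 million to 359 million kilometers pushes it out by roughly 40 percent, consistent with the 30-to-60-percent widening the researchers report. A quick sketch of that arithmetic:

```python
# How much does the habitable zone's outer edge move in the solar-system
# example quoted above?

outer_edge_without_h2 = 254e6   # kilometers, just beyond Mars' orbit
outer_edge_with_h2 = 359e6      # kilometers, roughly out to the asteroid belt

increase = outer_edge_with_h2 / outer_edge_without_h2 - 1
print(f"outer edge pushed out by {increase:.0%}")   # about 41%
```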

Exoplanets that astronomers had previously thought too cold to support life might, in fact, be ripe for habitability if they have hydrogen volcanoes, the researchers say. One example is TRAPPIST-1h, the farthest-out planet identified in an odd system of seven Earth-sized planets 39 light-years from Earth (SN Online: 2/22/17). That world is thought to be icy like Jupiter’s moon Europa.

Adding planets to a star’s habitable zone means more exotic worlds could be targets in the search for signatures of life beyond our solar system. Astronomers plan to search for these signatures with the James Webb Space Telescope, slated to launch in 2018, and later with the European Extremely Large Telescope, scheduled to begin operations in 2024.

Brain training turns recall rookies into memory masters

Just six weeks of training can turn average people into memory masters.

Boosting these prodigious mnemonic skills came with overhauls in brain activity, resulting in brains that behaved more like those of experts who win World Memory Championships, scientists report March 8 in Neuron.

The findings are notable because they show just how remarkably adaptable the human brain is, says neuroscientist Craig Stark of the University of California, Irvine. “The brain is plastic,” he says. “Through use, it changes.”
It’s not yet clear how long the changes in the newly trained brains last, but the memory gains persisted for four months.

In an initial matchup, a group of 17 memory experts, people who place high in World Memory Championships, throttled a group of people with average memories. Twenty minutes after seeing a list of 72 words, the experts remembered an average of 70.8 words; the nonexperts caught, on average, only 39.9 words.

In subsequent matchups, some nonexperts got varying levels of help. Fifty-one novices were split into three groups. A third of these people spent six weeks learning the method of loci, a memorization strategy used by ancient Greek and Roman orators. To use the technique, a person must imagine an elaborate mental scene, such as a palace or a familiar walking path, and populate it with memorable items. New information can then be placed onto this scaffold, offering a way to quickly “see” long lists of items.
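
In computing terms, the method of loci amounts to hanging arbitrary items on a fixed, well-rehearsed set of keys. A toy sketch of the idea, with an invented "palace" and word list (not materials from the study):

```python
# Toy illustration of the method of loci: pair each item to memorize with a
# fixed, well-known location, then "walk" the locations in order to recall.
# The rooms and the word list are made up for illustration.

palace = ["front door", "hallway mirror", "kitchen table", "staircase", "bedroom window"]
to_remember = ["anvil", "ribbon", "lantern", "cactus", "trumpet"]

memory = dict(zip(palace, to_remember))   # place one vivid item at each location

for location in palace:                   # recall by walking the route in order
    print(f"At the {location}, picture the {memory[location]}")
```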

Other participants spent six weeks training to improve short-term memory, performing a tricky task that required people to simultaneously keep track of a series of locations they see and numbers they hear. The rest of the participants had no training at all.

After the training, the people who learned the method of loci performed nearly as well as the memory experts. But the rest didn’t show such improvement. Study coauthor Martin Dresler, a neuroscientist at the Radboud University Medical Center in the Netherlands, knew that the method of loci works quite well; he wasn’t surprised to see those memory scores spike. To him, the more interesting changes happened in the trained people’s brains.
Before and after training, nonexperts underwent scans that pinpointed brain areas that were active at the same time, an indication that these brain areas work together closely. Dresler and colleagues looked at 2,485 connections in brain networks important for memory and visual and spatial thinking. Training in the method of loci seemed to reconfigure many of those connections, making some of the connections stronger and others weaker. The overall effect of training was to make brains “look like those of the world’s best memorizers,” Dresler says. The results suggest that large-scale changes across the brain, as opposed to changes in individual areas, drive the increased memory capacity.

These new memory skills were still obvious four months after training ended, particularly for the people whose brain behavior became more similar to that of the memory experts. The researchers didn’t scan participants’ brains four months out, so they don’t know whether the brain retains its reshaped connections. No such brain changes or big increases in memory skills were seen in the other groups.

Memorization techniques have been criticized as interesting tricks that have little use in real life. But “that’s not the case,” Dresler says. Boris Konrad, a coauthor of the study also at Radboud, is a memory master who trained in the method of loci. The technique “really helped him get much better grades” in physics and other complex studies, Dresler says.

Improvements in mnemonic memory, like other types of cognitive training, might not improve a broader range of thinking skills. The current study can’t answer bigger questions about whether brain training has more general benefits.

Smartphones may be changing the way we think

Not too long ago, the internet was stationary. Most often, we’d browse the Web from a desktop computer in our living room or office. If we were feeling really adventurous, maybe we’d cart our laptop to a coffee shop. Looking back, those days seem quaint.

Today, the internet moves through our lives with us. We hunt Pokémon as we shuffle down the sidewalk. We text at red lights. We tweet from the bathroom. We sleep with a smartphone within arm’s reach, using the device as both lullaby and alarm clock. Sometimes we put our phones down while we eat, but usually faceup, just in case something important happens.
Our iPhones, Androids and other smartphones have led us to effortlessly adjust our behavior. Portable technology has overhauled our driving habits, our dating styles and even our posture. Despite the occasional headlines claiming that digital technology is rotting our brains, not to mention what it’s doing to our children, we’ve welcomed this alluring life partner with open arms and swiping thumbs.

Scientists suspect that these near-constant interactions with digital technology influence our brains. Small studies are turning up hints that our devices may change how we remember, how we navigate and how we create happiness — or not.
Somewhat limited, occasionally contradictory findings illustrate how science has struggled to pin down this slippery, fast-moving phenomenon. Laboratory studies hint that technology, and its constant interruptions, may change our thinking strategies. Like our husbands and wives, our devices have become “memory partners,” allowing us to dump information there and forget about it — an off-loading that comes with benefits and drawbacks. Navigational strategies may be shifting in the GPS era, a change that might be reflected in how the brain maps its place in the world. Constant interactions with technology may even raise anxiety in certain settings.

Yet one large study that asked people about their digital lives suggests that moderate use of digital technology has no ill effects on mental well-being.

The question of how technology helps and hinders our thinking is incredibly hard to answer. Both lab and observational studies have drawbacks. The artificial confines of lab experiments lead to very limited sets of observations, insights that may not apply to real life, says experimental psychologist Andrew Przybylski of the University of Oxford. “This is a lot like drawing conclusions about the effects of baseball on players’ brains after observing three swings in the batting cage.”

Observational studies of behavior in the real world, on the other hand, turn up associations, not causes. It’s hard to pull out real effects from within life’s messiness. The goal, some scientists say, is to design studies that bring the rigors of the lab to the complexities of real life, and then to use the resulting insights to guide our behavior. But that’s a big goal, and one that scientists may never reach.

Evolutionary neurobiologist Leah Krubitzer is comfortable with this scientific ambiguity. She doesn’t put a positive or negative value on today’s digital landscape. Neither good nor bad, it just is what it is: the latest iteration on the continuum of changing environments, says Krubitzer, of the University of California, Davis.

“I can tell you for sure that technology is changing our brains,” she says. It’s just that so far, no one knows what those changes mean.

Of course, nearly everything changes the brain. Musical training reshapes parts of the brain. Learning the convoluted streets of London swells a mapmaking structure in the brains of cabbies. Even getting a good night’s sleep changes the brain. Every aspect of our environment can influence brain and behaviors. In some ways, digital technology is no different. Yet some scientists suspect that there might be something particularly pernicious about digital technology’s grip on the brain.

“We are information-seeking creatures,” says neuroscientist Adam Gazzaley of the University of California, San Francisco. “We are driven to it in very powerful ways.” Today’s digital tools give us unprecedented exposure to information that doesn’t wait for you to seek it out; it seeks you out, he says. That pull is nearly irresistible.

Despite the many unanswered questions about whether our digital devices are influencing our brains and behaviors, and whether for good or evil, technology is galloping ahead. “We should have been asking ourselves [these sorts of questions] in the ’70s or ’80s,” Krubitzer says. “It’s too late now. We’re kind of closing the barn doors after the horses got out.”
Attention grabber
One way in which today’s digital technology is distinct from earlier advances (like landline telephones) is the sheer amount of time people spend with it. In just a decade, smartphones have saturated the market, enabling instant internet access to an estimated 2 billion people around the world. In one small study reported in 2015, 23 adults, ages 18 to 33, spent an average of five hours a day on their phones, broken up into 85 distinct daily sessions. When asked how many times they thought they used their phones, participants underestimated by half.

In a different study, Larry Rosen, a psychologist at California State University, Dominguez Hills, used an app to monitor how often college students unlocked their phones. The students checked their phones an average of 60 times a day, each session lasting about three to four minutes for a total of 220 minutes a day. That’s a lot of interruption, Rosen says.
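
Put on a common footing, the two studies' figures describe similar habits; a quick calculation using only the numbers quoted above gives the average length of a session in each:

```python
# Average session length in each of the two usage studies quoted above.

# Study 1: about five hours a day spread over 85 sessions.
print(f"{5 * 60 / 85:.1f} minutes per session")   # roughly 3.5 minutes

# Study 2: 60 unlocks a day totaling about 220 minutes.
print(f"{220 / 60:.1f} minutes per session")      # roughly 3.7 minutes
```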
Smartphones are “literally omnipresent 24-7, and as such, it’s almost like an appendage,” he says. And often, we are compelled to look at this new, alluring rectangular limb instead of what’s around us. “This device is really powerful,” Rosen says. “It’s really influencing our behavior. It’s changed the way we see the world.”

Technology does that. Printing presses, electricity, televisions and telephones all shifted people’s habits drastically, Przybylski says. He proposes that the furor over digital technology melting brains and crippling social lives is just the latest incarnation of the age-old fear of change. “You have to ask yourself, ‘Is there something magical about the power of an LCD screen?’ ” Przybylski says.

Yet some researchers suspect that there is something particularly compelling about this advance. “It just feels different. Computers and the internet and the cloud are embedded in our lives,” says psychologist Benjamin Storm of the University of California, Santa Cruz. “The scope of the amount of information we have at our fingertips is beyond anything we’ve ever experienced. The temptation to become really reliant on it seems to be greater.”

Memory outsourcing
Our digital reliance may encourage even more reliance, at least for memory, Storm’s work suggests. Sixty college undergraduates were given a mix of trivia questions — some easy, some hard. Half of the students had to answer the questions on their own; the other half were told to use the internet. Later, the students were given an easier set of questions, such as “What is the center of a hurricane called?” This time, the students were told they could use the internet if they wanted.

People who had used the internet initially were more likely to rely on internet help for the second, easy set of questions, Storm and colleagues reported online last July in Memory. “People who had gotten used to using the internet continued to do so, even though they knew the answer,” Storm says. This kind of overreliance may signal a change in how people use their memory. “No longer do we just rely on what we know,” he says.
That work builds on results published in a 2011 paper in Science. A series of experiments showed that people who expected to have access to the internet later made less effort to remember things. In this way, the internet has taken the place formerly filled by spouses who remember birthdays, grandparents who remember recipes and coworkers who remember the correct paperwork codes — officially known as “transactive memory partners.”
“We are becoming symbiotic with our computer tools,” Betsy Sparrow, then at Columbia University, and colleagues wrote in 2011. “The experience of losing our internet connection becomes more and more like losing a friend. We must remain plugged in to know what Google knows.”

That digital crutch isn’t necessarily a bad thing, Storm points out. Human memory is notoriously squishy, susceptible to false memories and outright forgetting. The internet, though imperfect, can be a resource of good information. And it’s not clear, he says, whether our memories are truly worse, or whether we perform at the same level, but just reach the answer in a different way.

“Some people think memory is absolutely declining as a result of us using technology,” he says. “Others disagree. Based on the current data, though, I don’t think we can really make strong conclusions one way or the other.”

The potential downsides of this memory outsourcing are nebulous, Storm says. It’s possible that digital reliance influences — and perhaps even weakens — other parts of our thinking. “Does it change the way we learn? Does it change the way we start to put information together, to build our own stories, to generate new ideas?” Storm asks. “There could be consequences that we’re not necessarily aware of yet.”

Research by Gazzaley and others has documented effects of interruptions and multitasking, which are hard to avoid with incessant news alerts, status updates and Instagrams waiting in our pockets. Siphoning attention can cause trouble for a long list of thinking skills, including short- and long-term memory, attention, perception and reaction time. Those findings, however, come from experiments in labs that ask a person to toggle between two tasks while undergoing a brain scan, for instance. Similar effects have not been as obvious for people going about their daily lives, Gazzaley says. But he is convinced that constant interruptions — the dings and buzzes, our own restless need to check our phones — are influencing our ability to think.

Making maps
Consequences of technology are starting to show up for another cognitive task — navigating, particularly while driving. Instead of checking a map and planning a route before a trip, people can now rely on their smartphones to do the work for them. Anecdotal news stories describe people who obeyed the tinny GPS voice that instructed them to drive into a lake or through barricades at the entrance of a partially demolished bridge. Our navigational skills may be at risk as we shift to neurologically easier ways to find our way, says cognitive neuroscientist Véronique Bohbot of McGill University in Montreal.

Historically, getting to the right destination required a person to have the lay of the land, a mental map of the terrain. That strategy takes more work than one that’s called a “response strategy,” the type of navigating that starts with an electronic voice command. “You just know the response — turn right, turn left, go straight. That’s all you know,” Bohbot says. “You’re on autopilot.”
A response strategy is easier, but it leaves people with less knowledge. People who walked through a town in Japan with human guides did a better job later navigating the same route than people who had walked with GPS as a companion, researchers have found.

Scientists are looking for signs that video games, which often expose people to lots of response-heavy situations, influence how people get around. In a small study, Bohbot and colleagues found that people who average 18 hours a week playing action video games such as Call of Duty navigated differently than people who don’t play the games. When tested on a virtual maze, players of action video games were more likely to use the simpler response learning strategy to make their way through, Bohbot and colleagues reported in 2015 in Proceedings of the Royal Society B.

That easier type of response navigation depends on the caudate nucleus, a brain area thought to be involved in habit formation and addiction. In contrast, nerve cells in the brain’s hippocampus help create mental maps of the world and assist in the more complex navigation. Some results suggest that people who use the response method have bigger caudate nuclei, and more brain activity there. Conversely, people who use spatial strategies that require a mental map have larger, busier hippocampi.

Those results on video game players are preliminary and show an association within a group that may share potentially confounding similarities. Yet it’s possible that getting into a habit of mental laxity may change the way people navigate. Digital technology isn’t itself to blame, Bohbot says. “It’s not the technology that’s necessarily good or bad for our brain. It’s how we use the technology,” she says. “We have a tendency to use it in the way that seems to be easiest for us. We’re not making the effort.”

Parts of the brain, including those used to navigate, have many jobs. Changing one aspect of brain function with one type of behavior might have implications for other aspects of life. A small study by Bohbot showed that people who navigate by relying on the addiction-related caudate nucleus smoke more cigarettes, drink more alcohol and are more likely to use marijuana than people who rely on the hippocampus. What to make of that association is still very much up in the air.

Sweating the smartphone
Other researchers are trying to tackle questions of how technology affects our psychological outlooks. Rosen and colleagues have turned up clues that digital devices have become a new source of anxiety for people.
In diabolical experiments, Cal State’s Rosen takes college students’ phones away, under the ruse that the devices are interfering with laboratory measurements of stress, such as heart rate and sweating. The phones are left on, but placed out of reach of the students, who are reading a passage. Then, the researchers start texting the students, who are forced to listen to the dings without being able to see the messages or respond. Measurements of anxiety spike, Rosen has found, and reading comprehension dwindles.

Other experiments have found that heavy technology users last about 10 minutes without their phones before showing signs of anxiety.

Fundamentally, an interruption in smartphone access is no different from those in the days before smartphones, when the landline rang as you were walking into the house with bags full of groceries, so you missed the call. Both situations can raise anxiety over a connection missed. But Rosen suspects that our dependence on digital technology causes these situations to occur much more often.

“The technology is magnificent,” he says. “Having said that, I think that this constant bombardment of needing to check in, needing to be connected, this feeling of ‘I can’t be disconnected, I can’t cut the tether for five minutes,’ that’s going to have a long-term effect.”

The question of whether digital technology is good or bad for people is nearly impossible to answer, but a survey of 120,000 British 15-year-olds (99.5 percent reported using technology daily) takes a stab at it. Oxford’s Przybylski and Netta Weinstein at Cardiff University in Wales have turned up hints that moderate use of digital technology — TV, computers, video games and smartphones — correlates with good mental health, measured by questions that asked about happiness, life satisfaction and social activity.

When the researchers plotted technology use against mental well-being, an umbrella-shaped curve emerged, highlighting what the researchers call the “Goldilocks spot” of technology use — not too little and not too much.

“We found that you’ve got to do a lot of texting before it hurts,” Przybylski says. For smartphone use, the shift from benign to potentially harmful came after about two hours of use on weekdays, mathematical analyses revealed. Weekday recreational computer use had a longer limit: four hours and 17 minutes, the researchers wrote in the February Psychological Science.
For even the heaviest users, the relationship between technology use and poorer mental health wasn’t all that strong. For scale, the potential negative effects of all that screen time was less than a third of the size of the positive effects of eating breakfast, Przybylski and Weinstein found.
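
The "umbrella-shaped curve" is an inverted U: well-being rises with modest screen time, peaks, then falls. The sketch below illustrates that shape with invented coefficients (not the study's fitted values), showing how a turning point such as the roughly two-hour mark emerges from such a fit:

```python
# An inverted-U ("Goldilocks") curve relating daily screen hours to a
# well-being score. The coefficients are invented for illustration and are
# not the study's fitted values.

def wellbeing(hours, baseline=45.0, rise=4.0, penalty=1.0):
    return baseline + rise * hours - penalty * hours ** 2

peak_hours = 4.0 / (2 * 1.0)   # vertex of the parabola: rise / (2 * penalty)
print(f"well-being peaks near {peak_hours} hours")
for hours in [0, 1, 2, 4, 6]:
    print(hours, round(wellbeing(hours), 1))
```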

Even if a relationship is found between technology use and poorer mental health, scientists still wouldn’t know why, Przybylski says. Perhaps the effect comes from displacing something, such as exercise or socializing, and not the technology itself.

We may never know just how our digital toys shape our brains. Technology is constantly changing, and fast. Our brains are responding and adapting to it.

“The human neocortex basically re-creates itself over successive generations,” Krubitzer says. It’s a given that people raised in a digital environment are going to have brains that reflect that environment. “We went from using stones to crack nuts to texting on a daily basis,” she says. “Clearly the brain has changed.”

It’s possible that those changes are a good thing, perhaps better preparing children to succeed in a fast-paced digital world. Or maybe we will come to discover that when we no longer make the effort to memorize our best friend’s phone number, something important is quietly slipping away.

It’s time to redefine what qualifies as a planet, scientists propose

Pluto is a planet. It always has been, and it always will be, says Will Grundy of Lowell Observatory in Flagstaff, Arizona. Now he just has to convince the world of that.

For centuries, the word planet meant “wanderer” and included the sun, the moon, Mercury, Venus, Mars, Jupiter and Saturn. Eventually the moon and sun were dropped from the definition, but Pluto was included, after its discovery in 1930. That idea of a planet as a rocky or gaseous body that orbited the sun stuck, all the way up until 2006.
Then, the International Astronomical Union narrowed the definition, describing a planet as any round object that orbits the sun and has moved any pesky neighbors out of its way, either by consuming them or flinging them off into space. Pluto failed to meet the last criterion (SN: 9/2/06, p. 149), so it was demoted to a dwarf planet.

Almost overnight, the solar system was down to eight planets. “The public took notice,” Grundy says. It latched onto the IAU’s definition — perhaps a bit prematurely. The definition has flaws, he and other planetary scientists argue. First, it discounts the thousands of exotic worlds that orbit other stars and also rogue ones with no star to call home (SN: 4/4/15, p. 22).

Second, it requires that a planet cut a clear path around the sun. But no planet does that; Earth, Mars, Jupiter and Neptune share their paths with asteroids, and objects crisscross planets’ paths all the time.

The third flaw is related to the second. Objects farther from the sun need to be pretty bulky to cut a clear path. You could have a rock the size of Earth in the Kuiper Belt and it wouldn’t have the heft required to gobble down or eject objects from its path. So, it couldn’t be considered a planet.

Grundy and colleagues (all members of NASA’s New Horizons mission to Pluto) laid out these arguments against the IAU definition of a planet March 21 at the Lunar and Planetary Science Conference in The Woodlands, Texas.
A more suitable definition of a planet, says Grundy, is simpler: It’s any round object in space that is smaller than a star. By that definition, Pluto is a planet. So is the asteroid-belt object Ceres. So is Earth’s moon. “There’d be about 110 known planets in our solar system,” Grundy says, and plenty of exoplanets and rogue worlds would fit the bill as well.
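
At heart, the two definitions are just different sets of filters. A toy sketch of each rule (grossly simplified, and not anyone's official test) makes the contrast concrete:

```python
# Toy comparison of the two planet definitions discussed above (grossly simplified).

def iau_planet(is_round, orbits_the_sun, has_cleared_its_orbit):
    return is_round and orbits_the_sun and has_cleared_its_orbit

def geophysical_planet(is_round, is_smaller_than_a_star):
    return is_round and is_smaller_than_a_star

# Pluto: round, orbits the sun, but shares its neighborhood with other Kuiper Belt objects.
print(iau_planet(True, True, False))     # False, demoted in 2006
print(geophysical_planet(True, True))    # True under the proposed definition
```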

The reason for the tweak is to keep the focus on the features — the physics, the geology, the atmosphere — of the world itself, rather than worry about what’s going on around it, he says.

The New Horizons mission has shown that Pluto is an interesting world with active geology, an intricate atmosphere and other features associated with planets in the solar system. It makes no sense to write Pluto off because it doesn’t fit one criterion. Grundy seems convinced the public could easily readopt the small world as a planet, though he admits astronomers might be a tougher sell.

“People have been using the word correctly all along,” Grundy says. He suggests we stick with the original definition. That’s his plan.