River beds that can shift naturally are more efficient carbon sinks than straightened rivers


Rio Bermejo. Credit: Kristen Cook

It takes about 8,500 years for a grain of sand from the Andes to be washed across the Argentine lowlands into the Río Paraná. The 1,200-kilometer journey down the Río Bermejo is interrupted by many stops in the river's floodplains, where the grain is deposited, sometimes for thousands of years, and then washed free again. The sand is accompanied by organic carbon washed in from soil and plants. This waterborne transport matters for the climate: rivers carry carbon, previously removed from the atmosphere via photosynthesis, as sediment into the sea, where it is stored for thousands of years without affecting the climate.

Researchers at the GFZ German Research Centre for Geosciences have now quantified the individual stages of this journey for the first time and report on them in the journal Nature Geoscience. One important result: it is above all the undisturbed, meandering sections of a river where carbon is deposited, remobilized and then carried onward to the sea. In river sections with straight, stabilized banks, by contrast, only the suspended particle load passes through, while the carbon stored in the floodplains is slowly decomposed back into CO2 by microorganisms. GFZ working group leader Dirk Sachse says that “the Río Bermejo was an ideal natural laboratory for us because it has no significant tributaries.” Sachse is also director of the “Landscapes of the Future” topic in the Helmholtz program “Changing Earth—Sustaining Our Future.” He says that “this means that natural river courses that have space to erode floodplains can remove more carbon from the atmosphere than straight river sections. In this respect, the straightening of rivers by humans could also contribute to the increase in atmospheric CO2 concentration. What’s exciting now is answering the question of whether we can help the climate by giving rivers more space again and not impeding natural river meandering.”

The international team led by first author Marisa Repasch of GFZ studied the processes in the river and its floodplains with a diverse set of instruments. Analyses of cosmogenic beryllium-10 content, for example, indicated the duration of sediment transport. Dating based on the unstable carbon isotope 14C, in turn, allowed conclusions to be drawn about the age of the particles of organic origin. During fieldwork in Argentina, samples were taken from the river at multiple stations along the source-to-sink pathway. “Naturally meandering rivers erode material from floodplains and transport it to the sea, where it remains for a long time,” says Marisa Repasch, summarizing the results, “in contrast, artificially stabilized river courses are far less effective carbon sinks.”
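For readers unfamiliar with the dating step, the relation below is the textbook radiocarbon decay law, showing in general terms how a measured 14C fraction translates into an apparent age; it is not the authors' specific calibration or correction scheme.

```latex
% General radiocarbon decay relation (textbook form, not the authors' calibration):
% the fraction of 14C remaining in the organic particles sets an apparent age t.
\[
  \frac{N(t)}{N_0} = e^{-\lambda t},
  \qquad
  t = \frac{t_{1/2}}{\ln 2}\,\ln\!\frac{N_0}{N(t)},
  \qquad
  t_{1/2} \approx 5{,}730~\text{years}.
\]
```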



More information:
Marisa Repasch et al, Fluvial organic carbon cycling regulated by sediment transit time and mineral protection, Nature Geoscience (2021). DOI: 10.1038/s41561-021-00845-7

Citation:
River beds that can shift naturally are more efficient carbon sinks than straightened rivers (2021, October 29)
retrieved 30 October 2021
from https://phys.org/news/2021-10-river-be


Detector advance could lead to cheaper, easier medical scans


New ultrafast photon detectors allow for rapid processing of data from positron emission or X-ray scans without the need for tomography to reconstruct images. This image shows a brain phantom (model) scanned by positron emission using the new technology. Credit: Simon Cherry, UC Davis

Researchers in the U.S. and Japan have demonstrated the first experimental cross-sectional medical image that doesn’t require tomography, a mathematical process used to reconstruct images in CT and PET scans. The work, published Oct. 14 in Nature Photonics, could lead to cheaper, easier and more accurate medical imaging.

The advance was made possible by development of new, ultrafast photon detectors, said Simon Cherry, professor of biomedical engineering and of radiology at the University of California, Davis and senior author on the paper.

“We’re literally imaging at the speed of light, which is something of a holy grail in our field,” Cherry said.

Experimental work was led by Sun Il Kwon, project scientist in the UC Davis Department of Biomedical Engineering and Ryosuke Ota at Hamamatsu Photonics, Japan, where the new detector technology was developed. Other collaborators included research groups led by Professor Yoichi Tamagawa at the University of Fukui, and by Professor Tomoyuki Hasegawa at Kitasato University.

The process of tomography is required to mathematically reconstruct cross-sectional images from the data in imaging that uses X-rays or gamma rays. In PET scans, molecules tagged with trace amounts of a radioactive isotope are injected and taken up by organs and tissues in the body. The isotope, such as fluorine-18, is unstable and emits positrons as it decays.

Ultrafast photon detection

Whenever one of these positrons encounters an electron in the body, they annihilate each other and simultaneously give off two annihilation photons. Tracking the origin and trajectory of these photons theoretically creates an image of the tissues tagged with isotopes. But until now, researchers were unable to do that without the extra step of tomographic reconstruction, because detectors were too slow to precisely determine the arrival times of the two photons and thus pinpoint where they originated based on the difference in those arrival times.

When the annihilation photons strike the detector, they generate Cherenkov photons that produce the signal. Cherry and his fellow researchers figured out how to detect these Cherenkov photons with an average timing precision of 32 picoseconds. This meant they could determine where the annihilation photons arose with a spatial precision of 4.8 millimeters. This level of speed and accuracy enabled the research team to produce cross-sectional images of a radioactive isotope directly from the annihilation photons without having to use tomography.
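As a rough back-of-envelope check (a sketch, not the researchers' analysis), the spatial precision along the line between two detectors follows from the coincidence timing precision via Δx = cΔt/2:

```python
# Back-of-envelope check (not the researchers' code): how coincidence timing
# precision limits where an annihilation event can be placed along the line
# between two detectors, via delta_x = c * delta_t / 2.

C = 299_792_458.0  # speed of light, m/s

def localization_precision_mm(timing_precision_ps: float) -> float:
    """Spatial precision (mm) along the line of response for a given
    coincidence timing precision (picoseconds). The factor 1/2 appears
    because the time *difference* shifts the event by half the path."""
    delta_t = timing_precision_ps * 1e-12   # seconds
    return C * delta_t / 2.0 * 1e3          # metres -> millimetres

print(f"{localization_precision_mm(32):.1f} mm")  # ~4.8 mm, matching the reported figure
```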

In their paper, the researchers describe various tests they conducted with their new technique, including on a test object that mimics the human brain. They feel confident that this procedure is ultimately scalable to the level needed for clinical diagnostics and has the potential to create higher quality images using a lower radiation dose. Images can also be created more quickly with this method, potentially even in real time during the PET scan, as no after-the-fact reconstruction is needed.

PET scans are currently expensive and are technically limited in some ways, as the full information present in the travel time of the annihilation photons is not captured by current clinical scanners. This new discovery involves a compact equipment setup and could lead to inexpensive, easy and accurate scans of the human body using radioactive isotopes.

Additional coauthors are: Eric Berg at UC Davis; Fumio Hashimoto and Tomohide Omura, Hamamatsu Photonics; Kyohei Nakajima and Izumi Ogawa, University of Fukui.



More information:
Sun Il Kwon et al, Ultrafast timing enables reconstruction-free positron emission imaging, Nature Photonics (2021). DOI: 10.1038/s41566-021-00871-2

Citation:
Detector advance could lead to cheaper, easier medical scans (2021, October 29)
retrieved 30 October 2021
from https://phys.org/news/2021-10-detector-advance-cheaper-easier-medical.html


Is it worth trying to sway the most staunch climate deniers?


Boston University researcher finds some people who are receptive to disinformation about climate change are still open to the science. Credit: Callum Shaw on Unsplash

Thanks to algorithms that learn about social media users’ content preferences, Facebook timelines, Twitter feeds, suggested YouTube videos, and other news streams can look startlingly different from one person’s online account to the next. Media and communication experts often wrestle with how to rein in the forces that further polarize people with different views, especially people who sit on opposite sides of the political aisle. When it comes to online content that contains disinformation—inaccurate messages or propaganda intended to deceive and influence readers—why are some people more likely to believe falsehoods often spread via social media and the internet?

Arunima Krishna, a Boston University College of Communication researcher who studies public perceptions of controversial social issues, is studying the spread of disinformation, specifically related to climate change—an issue that has been manipulated by deniers for decades. In her latest study, Krishna surveyed 645 Americans about their beliefs on climate change—whether or not those beliefs are informed by fact or fiction—to assess their communication behaviors around the issue.

“I think a lot of folks don’t see how close to home climate change is. Even though we’re seeing climate refugees, [worsening] hurricanes, and other [natural] disasters, there is still a level of distance from the problem,” says Krishna, a College of Communication assistant professor of public relations.

She points out that physical distance from the effects of climate change could be partly why some people find it easier to separate themselves from the climate crisis. Plus, climate solutions are often things many people don’t readily want to do, like eating less meat, using less plastic, and buying fewer material goods. Fossil fuel companies and lobbyists for the industry have also worked extremely hard to keep the public from knowing the full extent of the damaging impact of burning fossil fuels, she says.

According to Krishna’s survey of Americans, 7 in 10 people who were susceptible to believing climate disinformation self-identified as politically conservative. In contrast, 8 in 10 Americans who self-identified as liberal were found to be immune to disinformation about climate change. Those findings reinforce past research from the Yale Program on Climate Change Communication, which found liberals and Democrats are significantly more worried about climate change than conservatives and Republicans, and are more likely to believe humans are causing the climate crisis.

Krishna also detected a difference in age between those who were more susceptible to disinformation and those who weren’t. More than half of the respondents immune to false information about climate were under 45. Those more receptive to climate disinformation were, on average, over the age of 46.

Diving deeper into the responses, Krishna categorized the survey results into four groups. The first segment, made up of people she calls the “disinformation immune,” have not accepted any disinformation about climate change and humans’ role in it, and they likely never will. The second group, the “disinformation vulnerable,” hold negative attitudes about how humans are influencing the climate. While they haven’t yet accepted disinformation, some of their responses to facts about climate change—as well as their attitudes and motivations—indicate they could come to believe climate disinformation in the future. The third group, the “disinformation receptive,” have already accepted false information about climate change. Lastly, the fourth group, the “disinformation amplifying,” is made up of people who hold extremely negative attitudes about climate change and doubt humans’ role in accelerating it, have already accepted disinformation, and are highly motivated to spread the disinformation they believe.

“My study found that [disinformation amplifiers] are more likely to spread their opinions about climate change compared to everybody else in the survey,” Krishna says. The amplifiers are known as what Krishna calls “lacuna publics,” a term she coined in 2017 when she was researching vaccine hesitant groups. (The word “publics” refers to groups connected by issue-specific motivation, and “lacuna” means a gap in knowledge.) Though the disinformation amplifiers, or lacuna publics, are in the minority, they are different from groups that are disinformation vulnerable or receptive because of their willingness to spread disinformation.

The United States has more climate skeptics than anywhere else in the world, Krishna says, but their ranks have started to shrink. Climate scientists around the world have found unequivocally that the more we continue to emit heat-trapping greenhouse gases into the atmosphere, the worse the consequences will be for humans, most species, and ecosystems on Earth.

Though there is no single solution to stopping the spread of climate disinformation, Krishna emphasizes the importance of engaging with people most vulnerable to believing disinformation. Lacuna publics, or amplifiers, however, might be difficult or impossible to sway.

“It might not be worth using resources to try to reach the lacuna publics,” Krishna says. “Research tells us that one-on-one interaction can often be more effective than mass media messages…so perhaps that’s the best way to [elevate] voices who are disinformation immune.”



More information:
Arunima Krishna, Lacuna publics: advancing a typology of disinformation-susceptible publics using the motivation-attitude-knowledge framework, Journal of Public Relations Research (2021). DOI: 10.1080/1062726X.2021.1944155


Researchers develop a new way to control and measure energy levels in a diamond crystal


Instrumentation setup in the Quantum Engineering Group at MIT to study dynamical symmetries with qubits in diamond crystals. Credit: Guoqing Wang/MIT

Physicists and engineers have long been interested in creating new forms of matter, those not typically found in nature. Such materials might find use someday in, for example, novel computer chips. Beyond applications, they also reveal elusive insights about the fundamental workings of the universe. Recent work at MIT both created and characterized new quantum systems demonstrating dynamical symmetry—particular kinds of behavior that repeat periodically, like a shape folded and reflected through time.

“There are two problems we needed to solve,” says Changhao Li, a graduate student in the lab of Paola Cappellaro, a professor of nuclear science and engineering. Li published the work recently in Physical Review Letters, together with Cappellaro and fellow graduate student Guoqing Wang. “The first problem was that we needed to engineer such a system. And second, how do we characterize it? How do we observe this symmetry?”

Concretely, the quantum system consisted of a diamond crystal about a millimeter across. The crystal contains many imperfections, each caused by a nitrogen atom sitting next to a gap in the lattice—a so-called nitrogen-vacancy center. Just like an electron, each center has a quantum property called a spin, with two discrete energy levels. Because the system is a quantum system, the spins can be found not only in one of the levels, but also in a combination of both energy levels, like Schrödinger’s theoretical cat, which can be both alive and dead at the same time.

The energy levels of the system are defined by its Hamiltonian, whose periodic time dependence the researchers engineered via microwave control. The system was said to have dynamical symmetry if its Hamiltonian was the same not only after every drive period T but also after, for example, every T/2 or T/3, like folding a piece of paper in half or in thirds so that no part sticks out. Georg Engelhardt, a postdoc at the Beijing Computational Science Research Center, who was not involved in this work but whose own theoretical work served as a foundation, likens the symmetry to guitar harmonics, in which a string might vibrate at both 100 Hz and 50 Hz.
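Written out, with T standing for the drive period (notation assumed here for clarity), the symmetry condition described above is simply that the engineered Hamiltonian repeats after an integer fraction of its period as well:

```latex
% Dynamical symmetry condition sketched from the description in the text.
\[
  H(t + T) = H(t),
  \qquad
  H\!\left(t + \tfrac{T}{n}\right) = H(t)
  \quad \text{for some integer } n \ge 2 \ \ (\text{e.g. } n = 2, 3).
\]
```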

To induce and observe such dynamical symmetry, the MIT team first initialized the system using a laser pulse. Then they directed various selected frequencies of microwave radiation at it and let it evolve, allowing it to absorb and emit the energy. “What’s amazing is that when you add such driving, it can exhibit some very fancy phenomena,” Li says. “It will have some periodic shake.” Finally, they shot another laser pulse at it and measured the visible light that it fluoresced, in order to measure its state. The measurement was only a snapshot, so they repeated the experiment many times to piece together a kind of flip book that characterized its behavior across time.
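The cycle described here lends itself to a simple sketch. The toy model below only mimics the repeat-and-average logic of taking many single-shot snapshots; the cosine "physics," the drive period, and the shot counts are placeholders, not the MIT team's actual experiment or instrument code.

```python
import math
import random

# Toy stand-in for the cycle described above: each "shot" initializes the
# spin, lets it evolve under a periodic drive for a chosen time, then returns
# a single binary fluorescence outcome. The cosine model and parameters are
# placeholders, not the real nitrogen-vacancy physics or the MIT setup.

def single_shot(evolution_time_ns: float, drive_period_ns: float = 100.0) -> int:
    p_bright = 0.5 * (1 + math.cos(2 * math.pi * evolution_time_ns / drive_period_ns))
    return 1 if random.random() < p_bright else 0   # one snapshot per shot

def average_fluorescence(evolution_time_ns: float, shots: int = 5000) -> float:
    """Average many single-shot readouts, mirroring the repeated
    laser-initialize / microwave-drive / fluorescence-readout cycle."""
    return sum(single_shot(evolution_time_ns) for _ in range(shots)) / shots

# Sweeping the evolution time and stacking the averages reconstructs the
# "flip book" of the driven system's behaviour over time.
trace = [average_fluorescence(t) for t in range(0, 200, 10)]
```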

Dynamical symmetries, which play an essential role in physics, are engineered and characterized by a cutting-edge quantum information processing toolkit. Credit: Image courtesy of the researchers.

“What is very impressive is that they can show that they have this incredible control over the quantum system,” Engelhardt says. “It’s quite easy to solve the equation, but realizing this in an experiment is quite difficult.”

Critically, the researchers observed that the dynamical symmetry of the Hamiltonian—the harmonics of the system’s energy levels—dictated which transitions could occur between one state and another. “And the novelty of this work,” Wang says, “is also that we introduce a tool that can be used to characterize any quantum information platform, not just nitrogen-vacancy centers in diamonds. It’s broadly applicable.” Li notes that their technique is simpler than previous methods, which require constant laser pulses to drive and measure the system’s periodic movement.

One engineering application is in quantum computers, systems that manipulate qubits—bits that can be not only 0 or 1, but a combination of 0 and 1. The spin of a nitrogen-vacancy center in diamond can encode one qubit in its two energy levels.

Qubits are delicate: they easily collapse into a simple bit, a 1 or a 0. Or the qubit might become the wrong combination of 0 and 1. “These tools for measuring dynamical symmetries,” Engelhardt says, “can be used as a sanity check that your experiment is tuned correctly—and with a very high precision.” He notes the problem of outside perturbations in quantum computers, which he likens to a de-tuned guitar. By tuning the tension of the strings—adjusting the microwave radiation—such that the harmonics match some theoretical symmetry requirements, one can be sure that the experiment is perfectly calibrated.

The MIT team already has their sights set on extensions to this work. “The next step is to apply our method to more complex systems and study more interesting physics,” Li says. They aim for more than two energy levels—three, or 10, or more. With more energy levels they can represent more qubits. “When you have more qubits, you have more complex symmetries,” Li says. “And you can characterize them using our method here.”



More information:
Guoqing Wang et al, Observation of Symmetry-Protected Selection Rules in Periodically Driven Quantum Systems, Physical Review Letters (2021). DOI: 10.1103/PhysRevLett.127.140604

This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

Citation:
Researchers develop a new way to control and measure energy levels in a diamond crystal (2021, October 28)
retrieved 29 October 2021
from https://phys.org/news/2021-10-energy-diamond-crystal.html



Researchers set ‘ultrabroadband’ record with entangled photons


Researchers in the lab of Qiang Lin at the University of Rochester have generated record ‘ultrabroadband’ bandwidth of entangled photons using the thin-film nanophotonic device illustrated here. At top left, a laser beam enters a periodically poled thin-film lithium niobate waveguide (banded green and gray). Entangled photons (purple and red dots) are generated with a bandwidth exceeding 800 nanometers. Credit: Usman Javid and Michael Osadciw

Quantum entanglement—or what Albert Einstein once referred to as “spooky action at a distance”— occurs when two quantum particles are connected to each other, even when millions of miles apart. Any observation of one particle affects the other as if they were communicating with each other. When this entanglement involves photons, interesting possibilities emerge, including entangling the photons’ frequencies, the bandwidth of which can be controlled.

Researchers at the University of Rochester have taken advantage of this phenomenon to generate an incredibly large bandwidth by using a thin-film nanophotonic device they describe in Physical Review Letters.

The breakthrough could lead to:

  • Enhanced sensitivity and resolution for experiments in metrology and sensing, including spectroscopy, nonlinear microscopy, and quantum optical coherence tomography
  • Higher dimensional encoding of information in quantum networks for quantum computing and communications

“This work represents a major leap forward in producing ultrabroadband entangled photons on a nanophotonic chip,” says Qiang Lin, professor of electrical and computer engineering. “And it demonstrates the power of nanotechnology for developing future quantum devices for communication, computing, and sensing.”

No more tradeoff between bandwidth and brightness

To date, most devices used to generate broadband entanglement of light have resorted to dividing up a bulk crystal into small sections, each with slightly varying optical properties and each generating different frequencies of the photon pairs. The frequencies are then added together to give a larger bandwidth.

“This is quite inefficient and comes at a cost of reduced brightness and purity of the photons,” says lead author Usman Javid, a Ph.D. student in Lin’s lab. In those devices, “there will always be a tradeoff between the bandwidth and the brightness of the generated photon pairs, and one has to make a choice between the two. We have completely circumvented this tradeoff with our dispersion engineering technique to get both: a record-high bandwidth at a record-high brightness.”

The thin-film lithium niobate nanophotonic device created by Lin’s lab uses a single waveguide with electrodes on both sides. Whereas a bulk device can be millimeters across, the thin-film device has a thickness of 600 nanometers—more than a million times smaller in its cross-sectional area than a bulk crystal, according to Javid. This makes the propagation of light extremely sensitive to the dimensions of the waveguide.

Indeed, even a variation of a few nanometers can cause significant changes to the phase and group velocity of the light propagating through it. As a result, the researchers’ thin-film device allows precise control over the bandwidth in which the pair-generation process is momentum-matched. “We can then solve a parameter optimization problem to find the geometry that maximizes this bandwidth,” Javid says.
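A brute-force version of such a parameter search might look like the sketch below. The bandwidth function is a made-up placeholder standing in for the real dispersion calculation, and the geometry ranges are assumed; none of it comes from the Rochester team's design code.

```python
import itertools

# Illustrative sketch only: a brute-force search over waveguide geometry for
# the design that maximizes phase-matching bandwidth. The bandwidth model is
# a made-up placeholder; the real calculation needs the dispersion of the
# thin-film lithium niobate waveguide. Geometry ranges are assumed.

def matched_bandwidth_nm(width_nm: float, thickness_nm: float) -> float:
    # Placeholder peaking near an arbitrary geometry, standing in for the
    # dispersion-engineered sweet spot described in the article.
    return 800.0 / (1.0 + ((width_nm - 1800) / 300) ** 2
                        + ((thickness_nm - 600) / 80) ** 2)

widths_nm = range(1200, 2401, 100)
thicknesses_nm = range(400, 801, 50)

best = max(itertools.product(widths_nm, thicknesses_nm),
           key=lambda geom: matched_bandwidth_nm(*geom))
print(best, f"-> {matched_bandwidth_nm(*best):.0f} nm bandwidth (toy model)")
```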

The device is ready to be deployed in experiments, but only in a lab setting, Javid says. In order to be used commercially, a more efficient and cost-effective fabrication process is needed. And although lithium niobate is an important material for light-based technologies, fabrication is “still in its infancy, and it will take some time to mature enough to make financial sense,” he says.

Other collaborators include coauthors Jingwei Ling, Mingxiao Li, and Yang He of the Department of Electrical and Computer Engineering, and Jeremy Staffa of the Institute of Optics, all of whom are graduate students. Yang He is a postdoctoral researcher.



More information:
Usman A. Javid et al, Ultrabroadband Entangled Photons on a Nanophotonic Chip, Physical Review Letters (2021). DOI: 10.1103/PhysRevLett.127.183601

Citation:
Researchers set ‘ultrabroadband’ record with entangled photons (2021, October 28)
retrieved 29 October 2021
from https://phys.org/news/2021-10-ultrabroadband-entangled-photons.html



Scientists identify G-Exos as a nanocarrier for miRNA transfer to stimulate neural differentiation of stem cells


Credit: CC0 Public Domain

Differentiation of bone marrow-derived mesenchymal stem cells (BMSCs) into functional neural cells is of tremendous significance for treating neural diseases. However, the limited neural differentiation of BMSCs remains a major challenge. Recent studies suggest that miRNAs may play a crucial role as signaling molecules in regulating the neural differentiation of stem cells. Due to their physicochemical attributes, however, miRNAs cannot be efficiently delivered in vitro or in vivo, which limits their application and translation.

Peng Lihua at the Zhejiang University College of Pharmaceutical Sciences has led research into ginseng-derived exosomes (G-Exos) as a carrier for miRNA transfer in the neural differentiation of BMSCs. The findings appear in an article titled “Plant Exosomes as Novel Nanoplatforms for MicroRNA Transfer Stimulate Neural Differentiation of Stem Cells in Vitro and in Vivo” in the journal Nano Letters.

In this study, the researchers isolated G-Exos from ginseng juice and loaded them, together with a chemokine, onto a photo-cross-linkable hydrogel to fabricate a convenient, safe and efficient multi-purpose wound-regeneration gel dressing, thereby achieving the targeted recruitment and induced neural differentiation of stem cells.

“This study demonstrated that G-Exos, which could circumvent the limitations of conventional RNA transfer strategies, may well become an effective nanoplatform in transferring plant-derived miRNAs to mammalian stem cells for neural differentiation both in vitro and in vivo, thus holding great promise in neural regenerative medicine,” said Prof. Peng.



More information:
Xue-Han Xu et al, Plant Exosomes As Novel Nanoplatforms for MicroRNA Transfer Stimulate Neural Differentiation of Stem Cells In Vitro and In Vivo, Nano Letters (2021). DOI: 10.1021/acs.nanolett.1c02530

Provided by
Zhejiang University

Citation:
Scientists identify G-Exos as a nanocarrier for miRNA transfer to stimulate neural differentiation of stem cells (2021, October 28)
retrieved 29 October 2021
from https://phys.org/news/2021-10-scientists-g-exos-nanocarrier-mirna-neural.html


Atmospheric river storms can drive costly flooding, and climate change is making them stronger


Satellite photo showing a river of moisture extending from Hawaii to California, Oct. 24, 2021. Credit: NOAA

Ask people to name the world’s largest river, and most will probably guess that it’s the Amazon, the Nile or the Mississippi. In fact, some of Earth’s largest rivers are in the sky—and they can produce powerful storms, like the ones now drenching northern California.

Atmospheric rivers are long, narrow bands of moisture in the atmosphere that extend from the tropics to higher latitudes. These rivers in the sky can transport 15 times the volume of the Mississippi River.

When that moisture reaches the coast and moves inland, it rises over the mountains, generating rain and snowfall. Many fire-weary westerners welcome these deluges, but the storms can also trigger other disasters, such as extreme flooding and debris flows.

In the past 20 years, as observation networks have improved, scientists have learned more about these important weather phenomena. Atmospheric rivers occur globally, affecting the west coasts of the world’s major land masses, including Portugal, Western Europe, Chile and South Africa. So-called “Pineapple Express” storms that carry moisture from Hawaii to the U.S. West Coast are just one of their many flavors.

My research combines economics and atmospheric science to measure damage from these storms. Recently I led a team of researchers from Scripps Institution of Oceanography and the Army Corps of Engineers in the first systematic analysis of damages from atmospheric rivers due to extreme flooding. We found that while many of these events are benign, the largest of them cause most of the flooding damage in the western U.S. And atmospheric rivers are predicted to grow longer, wetter and wider in a warming climate.

Rivers in the sky

On Feb. 27, 2019, an atmospheric river propelled a plume of water vapor 350 miles wide and 1,600 miles long through the sky from the tropical North Pacific Ocean to the coast of Northern California.

Atmospheric rivers are an important water source for the U.S. West. Credit: NOAA

Just north of San Francisco Bay, in Sonoma County’s famed wine country, the storm dumped over 21 inches of rain. The Russian River crested at 45.4 feet—13.4 feet above flood stage.

For the fifth time in four decades, the town of Guerneville was submerged under the murky brown floodwaters of the lower Russian River. Damages in Sonoma County alone were estimated at over US$100 million.

Events like these have drawn attention in recent years, but atmospheric rivers are not new. They have meandered through the sky for millions of years, transporting water vapor from the equator toward the poles.

In the 1960s meteorologists coined the phrase “Pineapple Express” to describe storm tracks that originated near Hawaii and carried warm water vapor to the coast of North America. By the late 1990s atmospheric scientists had found that over 90% of the world’s moisture from the tropics and subtropics was transported to higher latitudes by similar systems, which they named “atmospheric rivers.”

In dry conditions, atmospheric rivers can replenish water supplies and quench dangerous wildfires. In wet conditions, they can cause damaging floods and debris flows, wreaking havoc on local economies.

Helpful and harmful

Researchers have known for some time that flooding due to atmospheric rivers could cost a lot of money, but until our study no one had quantified these damages. We used a catalog of atmospheric river events compiled by Scripps Institution of Oceanography’s Center for Western Weather and Water Extremes, and matched it to 40 years of flood insurance records and 20 years of National Weather Service damage estimates.

After an atmospheric river event that caused severe flooding in Chile, sediment washed down from hillsides into the Itata River can be seen flowing up to 50 kilometers from the coast. Credit: NASA Earth Observatory

We found that atmospheric rivers caused an average of $1.1 billion in flood damages yearly in the western U.S. More than 80% of all flooding damages in the West in the years we studied were associated with atmospheric rivers. In some areas, such as coastal northern California, these systems caused over 99% of damages.

Our data showed that in an average year, about 40 atmospheric rivers made landfall along the Pacific coast somewhere between Baja California and British Columbia. Most of these events were benign: About half caused no insured losses, and these storms replenished the region’s water supply.

But there were a number of exceptions. We used a recently developed atmospheric river classification scale that ranks the storms from 1 to 5, similar to systems for categorizing hurricanes and tornadoes. There was a clear link between these categories and observed damages.

Flooding, rock slides, chain controls, overturned vehicles—and that was just this morning. This atmospheric river storm is expected to intensify with heavy rain and significant snow into tomorrow. Do NOT drive if you don’t have to. Follow @NWSSacramento @NWSReno for updates. pic.twitter.com/68Xf8fAZPw

— Caltrans District 3 (@CaltransDist3) October 24, 2021

Atmospheric River category 1 (AR1) and AR2 storms caused estimated damages under $1 million. AR4 and AR5 storms caused median damages in the tens and hundreds of millions of dollars, respectively. The most damaging AR4s and AR5s generated impacts of over $1 billion per storm. These billion-dollar storms occurred every three to four years.

A moister atmosphere means worse storms

Our most significant finding was an exponential relationship between the intensity of atmospheric rivers and the flood damages they caused. Each increase in the scale from 1 to 5 was associated with a 10-fold increase in damages.
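That reported tenfold-per-category relationship can be illustrated in a few lines of code. The dollar anchor for an AR1 storm below is an assumed placeholder, chosen only so the outputs fall in the ranges quoted above; it is not a figure from the study.

```python
# Illustration of the reported scaling: each step up the AR1-AR5 scale was
# associated with roughly a tenfold increase in flood damages. The AR1
# anchor value is an assumed placeholder, not a study estimate.

AR1_DAMAGE_USD = 5e4  # assumed anchor, chosen to match the quoted ranges

def typical_damage_usd(category: int) -> float:
    """Order-of-magnitude damage for an atmospheric river of the given
    category (1-5), under the tenfold-per-step relationship."""
    return AR1_DAMAGE_USD * 10 ** (category - 1)

for cat in range(1, 6):
    print(f"AR{cat}: ~${typical_damage_usd(cat):,.0f}")
# AR1-AR2 stay under $1 million; AR4-AR5 reach the tens to hundreds of
# millions, with the worst events exceeding $1 billion.
```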






Scientists have developed a scale for categorizing atmospheric rivers that reflects both their replenishing capacities and their dangerous effects.

Several recent studies have modeled how atmospheric rivers will change in the coming decades. The mechanism is simple: Greenhouse gases trap heat in the atmosphere, warming the planet. This causes more water to evaporate from oceans and lakes, and increased moisture in the air makes storm systems grow stronger.

Like hurricanes, atmospheric rivers are projected to grow longer, wider and wetter in a warming climate. Our finding that damages increase exponentially with intensity suggests that even modest increases in atmospheric river intensity could lead to significantly larger economic impacts.

Better forecasting is critical

I believe that improving atmospheric forecasting systems should be a priority for adapting to a changing climate. Better understanding of atmospheric rivers’ intensity, duration and landfall locations can provide valuable information to residents and emergency responders.

It also is important to discourage new construction in high-risk areas and help people move to safer locations after major disasters, rather than rebuilding in place.

Finally, our study underlines the need to reduce global greenhouse gas emissions. These storms will keep coming, and they’re getting stronger. In my view, stabilizing the global climate system is the only long-term way to minimize economic damage and risk to vulnerable communities.



This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation:
Atmospheric river storms can drive costly flooding, and climate change is making them stronger (2021, October 28)
retrieved 28 October 2021
from https://phys.org/news/2021-10-atmospheric-river-storms-costly-climate.html



New research shows most people are honest—except for a few


Credit: Pixabay/CC0 Public Domain

People are more honest in day-to-day life than previously thought—except for “a few prolific liars,” according to the results of a new study.

There has been an assumption for a long time that people lie a lot, and that lying is ubiquitous and really common, says the University of Alabama at Birmingham’s Timothy Levine, Ph.D., who studies deception. Past research has found that people lie, on average, about once or twice per day.

However, researchers have learned that the average number of lies per day reported in the literature does not reflect the behavior of most people. The distribution of lying is highly skewed: Most people report telling few or no lies on a given day; and most lies are told by only “a few prolific liars,” the study’s authors determined.

Levine and his co-authors, Kim B. Serota of Rochester University in Michigan and Tony Docan-Morgan of the University of Wisconsin-La Crosse, wrote the study “Unpacking variation in lie prevalence: Prolific liars, bad lie days, or both?” It was published in Communication Monographs, the flagship journal of the National Communication Association.

Levine, who is Distinguished Professor and chair of the Department of Communication Studies in the UAB College of Arts and Sciences, says the message in this paper is that everyday communication is probably safer than you think it is.

“That said, there are these few prolific liars out there,” Levine said. “And I think this study showed that they are a real thing. There is that kind of top 1 percent who are telling more than 15 lies per day, day in day out.”

Over the past decade, the skewed distribution of lie prevalence has emerged as an “exceptionally robust phenomenon,” Levine said. “People are mostly honest, except for a few pathological liars.”

Most previous research on lying looked at snapshots of deception at one point in time. This new groundbreaking research study involved so much work it may be one-of-a-kind, Levine says.

The study tracked people’s lying every day over three months. More than 630 participants were asked to keep a daily deception journal, yielding 116,336 lies.

The results showed about three-quarters of the study participants were consistently honest, telling between zero and two lies per day. By contrast, a small subset of people—about 6 percent—averaged more than six lies per day and accounted for a sizable proportion of the lies.
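A quick back-of-envelope check with the quoted figures (roughly 630 participants, about 90 days of daily journals, 116,336 lies in total) shows how a small prolific group can pull the overall average up to about two lies per person per day even when most people report very few. The per-group rates below are illustrative assumptions, not estimates from the study.

```python
# Back-of-envelope check with the quoted figures.
participants = 630          # approximate, per the article
days = 90                   # "three months" of daily journals, approximate
total_lies = 116_336
print(f"overall mean: {total_lies / (participants * days):.2f} lies/person/day")  # ~2

# Illustrative split consistent with the findings (rates are assumptions):
groups = [
    (0.75, 1.0),   # ~75% of people, about 1 lie/day
    (0.19, 3.0),   # remaining middle group (assumed rate)
    (0.06, 10.0),  # ~6% prolific liars, averaging well over 6 lies/day
]
mean = sum(share * rate for share, rate in groups)
prolific_fraction = groups[-1][0] * groups[-1][1] / mean
print(f"implied mean: {mean:.2f} lies/person/day")                     # ~1.9
print(f"share of all lies told by the prolific 6%: {prolific_fraction:.0%}")  # ~31%
```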

“It was a really hard study to pull off, where you get people to answer the survey every day for three months in a row,” Levine said. “And we find, yes, there are people who lie way more than the rest of us. Seventy-five percent of us are, really, pretty honest. There are a few people who are super-honest.”

People also have good and bad lie days, when they tell more or fewer lies than is typical for them, Levine says. People do not lie for the most part, he says, a few pathological liars aside. Also, for the most part, people do not lie unless they have a reason to. Our daily demands “are a big driver for most of us on how honest or dishonest we are,” Levine said.

This is more evidence that people are probably more honest than a lot of people might think, he says, even in the era of fake news and misinformation.

“People are mostly honest, and people mostly believe other people—and deception is more of a kind of exceptional thing rather than a chronic worry,” Levine said.



More information:
Kim B. Serota et al, Unpacking variation in lie prevalence: Prolific liars, bad lie days, or both?, Communication Monographs (2021). DOI: 10.1080/03637751.2021.1985153



Antarctic ozone hole is 13th largest on record and expected to persist into November


Lt. Timothy Holland, NOAA Corps, emerges from a balloon assembly station with an ozonesonde attached to a weather balloon before releasing it over the South Pole. Credit: Josiah Horneman, Station Physician’s Assistant, Antarctic Support Contract

The 2021 Antarctic ozone hole reached its maximum area on October 7 and ranks 13th largest since 1979, scientists from NOAA and NASA reported today. This year’s ozone hole developed similarly to last year’s: A colder than usual Southern Hemisphere winter led to a deep and larger-than-average hole that will likely persist into November or early December.

“This is a large ozone hole because of the colder than average 2021 stratospheric conditions, and without the Montreal Protocol, it would have been much larger,” said Paul Newman, chief scientist for Earth Sciences at NASA’s Goddard Space Flight Center.

What we call the ozone hole is a thinning of the protective ozone layer in the stratosphere (the upper layer of Earth’s atmosphere) above Antarctica that begins every September. Chlorine and bromine derived from human-produced compounds are released from reactions on high-altitude polar clouds. The chemical reactions then begin to destroy the ozone layer as the sun rises in the Antarctic at the end of winter.

Size matters. But how do you measure a hole in the atmosphere?

NOAA and NASA researchers detect and measure the growth and break up of the ozone hole with satellite instruments aboard Aura, Suomi-NPP and NOAA-20 satellites.

This year, NASA satellite observations determined the ozone hole reached a maximum of 9.6 million square miles (24.8 million square kilometers)—roughly the size of North America—before beginning to shrink in mid-October. Colder-than-average temperatures and strong winds in the stratosphere circling Antarctica contributed to the hole’s size.

This visualization depicts the ozone hole over Antarctica at its maximum extent on October 7, 2021. Scientists define the “ozone hole” as the area in which ozone levels are depleted below 220 Dobson Units (dark blue, marked with black triangle on color bar). Credit: Climate.gov

NOAA scientists at the South Pole Station record the ozone layer’s thickness by releasing weather balloons carrying ozone-measuring instruments called ozonesondes, which measure the varying ozone concentrations as the balloon rises into the stratosphere.

When the polar sun rises, NOAA scientists also make measurements with a Dobson spectrophotometer, an optical instrument that records the total amount of ozone between the surface and the edge of space, known as the total column ozone value. This year, scientists recorded the lowest total-column ozone value of 102 Dobson Units on October 7, the 8th lowest since 1986. At altitudes between 8 and 13 miles (14 to 21 kilometers), ozone was nearly completely absent during the ozone hole’s maximum extent.
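The 220-Dobson-Unit definition mentioned above translates directly into how a hole area can be computed from a gridded map of total column ozone. The sketch below uses a random placeholder field rather than real satellite retrievals, and a real analysis would restrict the mask to the Antarctic region.

```python
import numpy as np

# Sketch of how an ozone-hole area can be derived from a gridded map of
# total column ozone using the 220 Dobson Unit threshold. The random field
# is a placeholder for real satellite data.

EARTH_RADIUS_KM = 6371.0
lats = np.arange(-89.5, 90.0, 1.0)                 # 1-degree cell centres
lons = np.arange(-179.5, 180.0, 1.0)
column_ozone_du = np.random.uniform(100, 350, size=(lats.size, lons.size))

# Area of each 1x1-degree cell shrinks toward the poles with cos(latitude).
dlat = dlon = np.deg2rad(1.0)
cell_area_km2 = EARTH_RADIUS_KM**2 * dlat * dlon * np.cos(np.deg2rad(lats))[:, None]

hole_mask = column_ozone_du < 220.0                # the 220 DU definition
hole_area_km2 = float((cell_area_km2 * hole_mask).sum())
print(f"ozone hole area: {hole_area_km2 / 1e6:.1f} million km^2")
```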

This chart depicts the average extent of the Antarctic ozone hole during peak ozone depletion season from 1979 to present. Scientists said the 2021 ozone hole would have been 1.5 million square miles larger if atmospheric chlorine levels were as high today as they were in the early 2000s. Credit: NASA

The good news

Though the 2021 Antarctic ozone hole is larger than average, it’s substantially smaller than ozone holes measured during the late 1990s and early 2000s.

The ozone hole is recovering thanks to the Montreal Protocol and subsequent amendments banning the release of harmful ozone-depleting chemicals called chlorofluorocarbons, or CFCs. If atmospheric chlorine levels from CFCs were as high today as they were in the early 2000s, this year’s ozone hole would likely have been larger by about 1.5 million square miles (about four million square kilometers) under the same weather conditions.



Citation:
Antarctic ozone hole is 13th largest on record and expected to persist into November (2021, October 28)
retrieved 28 October 2021
from https://phys.org/news/2021-10-antarctic-ozone-hole-13th-largest.html



Identifying old bones by scanning the collagen in them using mass spectrometry


Figure 1. Examples of peptide mass fingerprints for O. rufus, P. cinereus, S. harrisii, T. aculeatus, T. cynocephalus and V. ursinus. Credit: DOI: 10.1098/rsos.211229

A team of researchers from Australia, Germany, the U.S. and Canada has found that it is possible to identify old bones by scanning the collagen they contain using mass spectrometry. In their paper published in the journal Royal Society Open Science, the group describes using the technique to identify extant monotreme and marsupial species on Australia’s Barrow Island.

Collagen is a fibrous protein typically found in the bones and connective tissue of humans and other animals. Prior research has shown that its chemical signature differs between species. In this new effort, the researchers applied that knowledge to the study of bones found on Barrow Island.

As the researchers note, figuring out which animal left its bones behind is often difficult, particularly when the bones have become fragmented. In the past, zooarchaeologists have had to rely on the shape of bones or teeth to identify them. To improve on these methods, the researchers studied the use of mass spectrometry to identify collagen sources in their lab. Such efforts led to the development of a technique they call ZooMS (zooarchaeology by mass spectrometry).

The technique involves collecting samples as small as a few milligrams, dissolving them with certain enzymes and then scanning the results with a mass spectrometer. Once scanned, the results are compared with other samples that have already been identified. During testing, the researchers identified 24 unique species of creatures living in Australia.
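The comparison step can be pictured as simple peak matching against a reference library, as in the sketch below. The species entries and marker masses are invented placeholders, not the team's actual ZooMS markers or reference spectra.

```python
# Illustrative sketch of the comparison step: match a measured peptide-mass
# fingerprint against reference fingerprints of known species by counting
# shared peaks within a mass tolerance. Entries and masses are placeholders.

REFERENCE_FINGERPRINTS = {
    "species A (e.g. a kangaroo)": [1105.6, 1477.7, 2131.1, 2883.3],
    "species B (e.g. a wombat)":   [1105.6, 1493.7, 2145.1, 2899.4],
}

def count_matches(measured, reference, tolerance=0.5):
    """Number of reference peaks that have a measured peak within +/- tolerance (m/z)."""
    return sum(any(abs(m - r) <= tolerance for m in measured) for r in reference)

def best_match(measured_peaks):
    scores = {sp: count_matches(measured_peaks, ref)
              for sp, ref in REFERENCE_FINGERPRINTS.items()}
    return max(scores, key=scores.get), scores

species, scores = best_match([1105.5, 1477.8, 2131.0, 2883.4])
print(species, scores)  # close relatives can tie, which is why some IDs stop at genus level
```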

The researchers note that the technique is not perfect, particularly when closely related species are present in a collection of bones. In some cases, the technique may only be able to identify the genus of the bones under study. Still, the researchers were able to use it to identify one species of turtle at a dig site on Barrow Island. Prior research had shown that the large collection of bones there was left on the island by Indigenous divers in the 19th century who had been forced to dive for pearls by European masters. The bones were the remains of the animals the divers had captured to feed themselves.



More information:
Carli Peters et al, Species identification of Australian marsupials using collagen fingerprinting, Royal Society Open Science (2021). DOI: 10.1098/rsos.211229

© 2021 Science X Network

Citation:
Identifying old bones by scanning the collagen in them using mass spectrometry (2021, October 27)
retrieved 27 October 2021
from https://phys.org/news/2021-10-bones-scanning-collagen-mass-spectrometry.html

