Hexbyte Glen Cove Distant quasar J0439+1634 explored in X-rays


The XMM-Newton EPIC image of J0439+1634, combined from the pn, MOS1, and MOS2 images, in the 0.2–0.5 keV (left), 0.5–2 keV (middle) and 2–10 keV (right) X-ray bands. Credit: Yang et al., 2021.

Using ESA’s XMM-Newton spacecraft, an international team of astronomers has conducted X-ray observations of the most distant known gravitationally lensed quasar—J0439+1634. Results of the study, published December 20 on the arXiv pre-print server, shed more light on the properties of this source.

Quasars, or quasi-stellar objects (QSOs), are extremely luminous active galactic nuclei (AGN) containing supermassive central black holes with accretion disks. Their redshifts are measured from the strong spectral lines that dominate their visible and ultraviolet spectra.

Astronomers are especially interested in studying high-redshift quasars (at redshift higher than 5.0), as they are the most luminous and most distant compact objects in the observable universe. Spectra of such QSOs can be used to estimate the masses of supermassive black holes, which in turn constrain the formation and evolution models of quasars. Therefore, high-redshift quasars could serve as a powerful tool to probe the early universe.

At a redshift of 6.52, J0439+1634 is the first known gravitationally lensed high-redshift quasar. Its high lensing magnification makes it an excellent target for the study of X-ray emission from a reionization-era QSO that is intrinsically less luminous.

J0439+1634 is also a so-called broad absorption line (BAL) quasar. In general, BAL quasars are assumed to be highly absorbed in the soft X-ray band and are generally X-ray weak in observations of low-redshift quasars. However, to date no such studies of high-redshift BAL QSOs have been carried out, due to their faint X-ray emission.

So a team of researchers led by Jinyi Yang of the University of Arizona decided to explore J0439+1634 with the European Photon Imaging Camera (EPIC) system onboard XMM-Newton. They investigated the X-ray properties of this source through spectral analysis and compared the results with other quasar populations.

J0439+1634 was identified as an X-ray source with XMM-Newton in the 0.5–10 keV band by all three EPIC cameras, while it was not detected in the 0.2–0.5 keV band. The EPIC spectra show that the quasar has a flat photon index—at a level of approximately 1.45.

The observations found that the optical-to-X-ray spectral slope of J0439+1634 is about −2.07, suggesting that this quasar is underluminous by a factor of 18 in X-rays, which is consistent with the behavior of BAL QSOs observed at lower redshift. The astronomers noted that their study marks the first time an X-ray weak BAL quasar at a redshift above 6.0 has been observed spectroscopically.
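The factor of 18 follows from the standard definition of the optical-to-X-ray slope, α_ox = 0.3838 log₁₀(f_2keV/f_2500Å). A minimal sketch in Python; the Δα_ox ≈ −0.48 offset below is an illustrative value chosen to reproduce the reported factor, not a number quoted from the paper:

```python
import math

def alpha_ox(f_2kev, f_2500):
    """Optical-to-X-ray spectral slope between rest-frame 2 keV and 2500 A."""
    return 0.3838 * math.log10(f_2kev / f_2500)

def xray_weakness(delta_alpha_ox):
    """Factor by which a quasar is X-ray weak, given the offset of its
    measured alpha_ox from the value expected for its UV luminosity."""
    return 10 ** (-delta_alpha_ox / 0.3838)

# A deficit of about -0.48 relative to the expected slope corresponds
# to the ~18x X-ray weakness reported for J0439+1634.
print(round(xray_weakness(-0.48)))  # 18
```

The constant 0.3838 is simply 1/log₁₀(ν_2keV/ν_2500Å), so the slope is just a logarithmic flux ratio normalized by the frequency separation.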

Furthermore, spectral fitting using an absorbed power-law model suggests a high intrinsic column density in the case of J0439+1634—more than 2 × 10²³ cm⁻² (200 sextillion per square centimeter). This finding, according to the researchers, makes J0439+1634 the first highly obscured quasar in the reionization epoch to be characterized with X-ray spectroscopy. They note that the source could alternatively be an intrinsically X-ray weak quasar.



More information:
Deep XMM-Newton Observations of an X-ray Weak, Broad Absorption Line Quasar at z=6.5, arXiv:2112.10785 [astro-ph.GA] arxiv.org/abs/2112.10785

© 2021 Science X Network

Citation:
Distant quasar J0439+1634 explored in X-rays (2021, December 29)
retrieved 29 December 2021
from https://phys.org/news/2021-12-distant-quasar-j04391634-explored-x-rays.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.


Two decades on, euro can’t shake reputation as price driver


by Ali Bekhtaoui, with Jean-Philippe Lacour in Frankfurt and Thomas Perroteau in Madrid

Studies show the switch to the euro did not cause the price increases that consumers have denounced.

Europeans have been living with euro banknotes and coins for 20 years now, but many still point the finger at the single currency for covertly driving up consumer prices—despite plenty of evidence to the contrary.

“The euro is a catastrophe, it’s catastrophic,” says Maria Napolitano, a 65-year-old Italian living in Frankfurt.

“With 100 deutschmarks, you could fill up your shopping trolley. Now, 100 euros aren’t enough to fill two bags.”

It’s an impression that is shared by many across the eurozone, from Paris to Rome, Madrid to Athens.

Victor Irun, a 53-year-old teacher in Madrid, said that for Spaniards the switch to the euro was “like entering a club for rich people while not wearing the right clothes.

“You had the impression we weren’t yet ready,” he told AFP. “It was as if we were living in Spain, but paying with French or Dutch money.”

Some Germans even coined a new nickname for the single currency—”teuro”, a pun on the German word for expensive, “teuer”.

Perception and data

In a 2006 study, university professor Hans Wolfgang Brachinger compiled an index of “perceived inflation” among Germans, which measured a “felt” increase in consumer prices of seven percent between 2001 and 2002, compared with around two percent normally.

In reality, however, official statistics showed no pick-up in inflation in Europe’s biggest economy.

“There was a very strong feeling in several countries that prices had gone up, and yet we had official stats telling us otherwise,” said Giovanni Mastrobuoni, an economics professor at the University of Turin.

In a detailed study on the issue, he tracked the prices of different everyday goods in the eurozone—from fruit, vegetables and bread, to beverages and restaurant meals—and found that the prices of some inexpensive products really did increase as a result of the switch to the euro.

And as retailers tended to round up rather than down when converting the price into the single currency, it was hardly surprising that people noticed—and felt cheated.

“The things that you consume very often, food, coffee, things that don’t cost that much. Over time, they would shape your perceptions because these are the most frequent transactions,” Mastrobuoni said.

In France, the average price of a simple coffee in a bar rose from 1.19 euros to 1.22 euros between the end of 2001 and the beginning of 2002, according to the national statistics office, Insee.
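As a quick arithmetic check on the Insee figures, that rise works out to about 2.5 percent in a matter of months—well above the roughly 2 percent annual inflation of the period:

```python
# Average price of a simple coffee in a French bar (Insee figures)
old_price, new_price = 1.19, 1.22  # euros, end of 2001 vs. start of 2002
pct_increase = (new_price - old_price) / old_price * 100
print(f"{pct_increase:.1f}%")  # 2.5%
```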

Headline inflation unaffected

According to Mastrobuoni, the phenomenon is even more marked in countries where distribution sectors are less concentrated, as small retailers there are at greater liberty to raise their prices.

The European Commission calculates that the actual increase linked to the introduction of the euro was between 0.1 and 0.3 percent.

However, headline inflation was largely unaffected because the prices of higher-end goods did not increase and, in some cases, even declined thanks to improved productivity.

For all 12 of the countries that adopted the euro from the very beginning, the average annual rate of inflation stood at 2.3 percent in both 2001 and 2002, according to the EU’s Eurostat statistics agency.

However, the picture varied from country to country.

In Spain, inflation stood at 2.8 percent in 2001 and 3.6 percent in 2002, while most other countries recorded constant inflation or, in the case of Belgium, Germany, Luxembourg and the Netherlands, slower inflation.

Pierre Jaillet, a researcher at both the Jacques Delors and Iris institutes in France, said that consumers’ profiles played a significant role in whether they felt a discrepancy between real and perceived price developments.

“The average consumer price inflation basket corresponds to the average budget of an average urban white-collar worker,” Jaillet said.

People who are less well-off tend to spend a greater proportion of their income on food, so they will be squeezed more, he argued, noting that consumers generally remembered price increases, but not price reductions.



© 2021 AFP

Citation:
Two decades on, euro can’t shake reputation as price driver (2021, December 29)


Euro banknote artist fears redesign could revive rivalries


Austrian artist Robert Kalina used neutral illustrations when designing the original euro banknotes.

Twenty years after arriving in Europeans’ wallets, euro banknotes will get a new look with help from the public, a process officials hope will make citizens feel closer to the single currency.

But the Austrian artist behind the original banknotes fears the redesign could spark national rivalries, something he painstakingly tried to avoid with neutral illustrations the first time around.

Now retired, Robert Kalina was working as a graphic designer for the Austrian National Bank when he won a competition in 1996 to create the artwork for the first-ever euro notes.

“It’s incredible to think that the euro is already 20 years old, I hope it stays around for a long time to come,” he told AFP.

Kalina’s designs were initially printed on 14.5 billion banknotes in denominations ranging from five to 500 euros.

The bills in circulation have since almost doubled in volume and found their way into the hands of some 350 million Europeans and many more people around the world.

A challenge

Euro coins, which are minted by euro members, have a shared image on one side and a country-specific one on the other. Ireland, for instance, opted for a harp, France for a tree.

But euro banknotes are issued by the European Central Bank, and their designs had to be identical across the euro region and avoid “national bias”.

The challenge for Kalina was coming up with illustrations all Europeans could identify with, without stirring nationalist sentiments or appearing to favour one eurozone nation over another.

“Portraits might have been allowed, but only if the faces were anonymous. I excluded that option right away,” Kalina said.

He decided to focus on architecture.

Drawing on inspiration from existing buildings, Kalina simplified and reworked their depictions with the help of engineering experts, to ensure the structures “were no longer recognisable” but still believable.

His bridge designs, showcasing different historical styles in Europe, symbolise the connection between eurozone citizens, “but also between the European Union and the rest of the world”.

The windows and doorways on the other side of the notes stand for “openness and a vision of the future”.

Despite the numerous crises that have rocked the currency since its birth, Kalina says the ideals he sought to portray are “still valid”.

But earlier this month, the ECB said the bills were ready for a makeover, announcing a design and consultation process with a decision expected in 2024.

Jealousy?

“After 20 years, it’s time to review the look of our banknotes to make them more relatable to Europeans of all ages and backgrounds,” said ECB president Christine Lagarde.

Euro banknotes are “here to stay”, she said, although the ECB is also considering creating a digital euro in step with other central banks around the globe.

The ECB will rely on a 19-person panel of experts for the design—one from each nation—and consult the public along the way.

“The question is whether people have come far enough to accept, for example, famous people being represented”, even if they are linked to a particular country, Kalina said.

“Might it perhaps cause jealousy?” he asked, recalling heated debates on the issue in the 1990s.

The world of music might be a good place to look for inspiration for the next generation of notes, Kalina mused, since “great composers like Beethoven or Mozart can’t be reduced to a single country”.

Music “is a language that doesn’t require words and one that everyone can understand,” he said.



© 2021 AFP

Citation:
Euro banknote artist fears redesign could revive rivalries (2021, December 29)
retrieved 29 December 2021
from https://phys.org/news/20


Chilean scientists study climate change at ‘end of the world’


A recent expedition on board the oceanographic research vessel Cabo de Hornos in the far south of Chile sought to investigate harmful organisms and how they are impacting climate change.

Chilean scientists studying organisms in one of the most remote places on Earth are urging regional leaders to step up efforts to tackle climate change.

A recent expedition, which was delayed by a year due to the coronavirus pandemic, sought to investigate harmful organisms and how they are impacting climate change.

Chile’s Magallanes region—on the southern tip of South America where the Atlantic and Pacific oceans meet—is known as the “end of the world” and extends from Punta Arenas through the Magallanes Strait to the Beagle Channel.

Sailing through peak-lined straits past glaciers and soaring birds, the scientists on board the oceanographic research vessel Cabo de Hornos had their focus trained on the water, which has lower levels of acidity, salt and calcium than other seas and oceans, especially in their shallowest parts.

Scientists believe the conditions found in the water will appear in other parts of the world in the coming decades, as the impact of climate change mounts.

“The regional plans for mitigation and adaptation to climate change are out of date with respect to what is happening in the environment,” Jose Luis Iriarte, who headed the expedition, told AFP.

The expedition stopped at 14 places, each time taking water samples at different levels up to a depth of 200 meters using a piece of equipment called a rosette.

“The environment is changing quicker than we as a society are responding to it.”

The scientific mission paid special attention to the “red tides”—harmful algal blooms that can turn the sea red.

They were first recorded in the Magallanes region half a century ago and have since been responsible for the deaths of 23 people and the poisoning of more than 200.

This area is also affected by melting glaciers, a product of global warming.

“We don’t know how these organisms and particularly microorganisms will respond to these effects,” said Iriarte.


Another piece of equipment was used to collect soil samples, sometimes at a depth of more than 300 meters.

The scientists also combed the shores for algae and molluscs.


Last bastion of biodiversity

From the highest point on the boat, marine biologist Rodrigo Hucke, one of 19 scientists on the expedition, spent hours scanning the surface of the water.

Spotting a far off whale, he would give the signal and then jump into a small motorboat to try to get as close as possible to the huge mammal in a bid to collect its faeces, with the aim of looking for changes to its diet.

Hucke says there has been a historical lack of action by governments when it comes to the oceans, which cover 70 percent of the planet’s surface.

He hopes the next UN Climate Change Conference—COP27 in Egypt—will mark a true global transformation in how the oceans are managed.

“All of this needs to change in 2022 and there needs to be a concrete decision in advancing toward profound policies of change in how us humans do things,” said Hucke.

He is worried that this region could one day become “one of the last bastions of biodiversity on Earth.”

After the nine-day mission, it was time to head back to laboratories to analyze the information gathered.

“I think we’re the voice of what nature cannot say,” said Wilson Castillo, a biochemistry student who, at 24, was the youngest member of the expedition.



© 2021 AFP

Citation:
Chilean scientists study climate change at ‘end of the world’ (2021, December 29)
retrieved 29 December 2021
from https://phys.org/news/2021-12-chilean-scientists-climate-world.html



Researchers develop structural blueprint of nanoparticles to target white blood cells responsible for lung inflammation


Credit: CC0 Public Domain

The COVID-19 pandemic highlighted the devastating impact of acute lung inflammation (ALI), which is part of the acute respiratory distress syndrome (ARDS) that is the dominant cause of death in COVID-19. A potential new route to the diagnosis and treatment of ARDS comes from studying how neutrophils—the white blood cells responsible for detecting and eliminating harmful particles in the body—use a material’s surface structure to decide what to take up, favoring particles that exhibit “protein clumping,” according to new research from the Perelman School of Medicine at the University of Pennsylvania. The findings are published in Nature Nanotechnology.

Researchers investigated how neutrophils are able to differentiate between bacteria to be destroyed and other compounds in the bloodstream, such as cholesterol particles. They tested a library of 23 different protein-based nanoparticles in mice with ALI, which revealed a set of “rules” that predict uptake by neutrophils. Neutrophils don’t take up symmetrical, rigid particles, such as viruses, but they do take up particles that exhibit “protein clumping,” which the researchers call nanoparticles with agglutinated protein (NAPs).

“We want to utilize the existing function of neutrophils that identifies and eliminates invaders to inform how to design a ‘Trojan horse’ nanoparticle that overactive neutrophils will intake and deliver treatment to alleviate ALI and ARDS,” said study lead author Jacob Myerson, Ph.D., a postdoctoral research fellow in the Department of Systems Pharmacology and Translational Therapeutics. “In order to build this ‘Trojan horse’ delivery system, though, we had to determine how neutrophils identify which particles in the blood to take up.”

ALI and ARDS are life-threatening forms of respiratory failure with high morbidity and mortality rates. Prior to COVID-19, there were 190,000 annual cases of ARDS in the U.S. and 75,000 deaths, with ARDS caused by pneumonia, sepsis, and trauma. However, COVID-19 has increased ARDS cases into the millions. When ALI or ARDS occurs, the lung’s air sacs recruit neutrophils to the lungs in order to eliminate circulating microbes. This process causes neutrophils to release compounds that further aggravate lung injury and damage the air sacs, so patients develop low blood oxygen levels. Unfortunately, despite the severity of ALI/ARDS, there is no effective drug to control it, and treatment currently focuses on supporting patients while the lungs naturally, but slowly, heal.

To address ARDS and other serious diseases, researchers at Penn and elsewhere have been using nanoparticles to concentrate drugs in injured or diseased organs. Such nanoparticles are also being used for vaccines and immunotherapy.

The researchers note that while the development of viable ALI/ARDS therapies that use nanoparticles to deliver treatments via neutrophils is a long way off, this research represents a significant step in understanding the condition and the function of the immune system.

“Now that we have determined that neutrophils patrol for nanoparticles with agglutinated protein, our next step is to understand how and why other microbes, like viruses, which are rigid and symmetrical, evolved to evade neutrophils,” said senior author Jacob Brenner, MD, Ph.D., an associate professor of Pulmonary Medicine in the Division of Pulmonary, Allergy, and Critical Care. “With this knowledge, we can continue to utilize this unique combination of material science and engineering, to create disease-specific therapies that target more advanced and complicated pathologies.”



More information:
Jacob W. Myerson et al, Supramolecular arrangement of protein in nanoparticle structures predicts nanoparticle tropism for neutrophils in acute lung inflammation, Nature Nanotechnology (2021). DOI: 10.1038/s41565-021-00997-y

Citation:
Researchers develop structural blueprint of nanoparticles to target white blood cells responsible for lung inflammation (


Team proposes ‘nano-chocolates’ as a new way to store hydrogen


The palladium nanoparticles (green) are stabilized by a core of iridium (red). Hydrogen can accumulate on their surface like a kind of chocolate glaze—and can be released again by heating. Credit: DESY, Andreas Stierle

An innovative approach could turn nanoparticles into simple reservoirs for storing hydrogen. The highly volatile gas is considered a promising energy carrier for the future, which could provide climate-friendly fuels for airplanes, ships and trucks, for example, as well as allowing climate-friendly steel and cement production—depending on how the hydrogen gas is generated. However, storing hydrogen is costly: Either the gas has to be kept in pressurized tanks, at up to 700 bar, or it must be liquified, which means cooling it down to minus 253 degrees Celsius. Both procedures consume additional energy.

A team led by Deutsches Elektronen-Synchrotron (DESY)’s Andreas Stierle has laid the foundations for an alternative method: storing hydrogen in tiny nanoparticles made of the precious metal palladium, just 1.2 nanometers in diameter. The fact that palladium can absorb hydrogen like a sponge has been known for some time. “However, until now getting the hydrogen out of the material again has posed a problem,” Stierle explains. “That’s why we are trying palladium particles that are only about one nanometer across.” A nanometer is a millionth of a millimeter.

To ensure that the nanoparticles are sufficiently sturdy, they are stabilized by a core made of the rare precious metal iridium. In addition, they are attached to a graphene support, an extremely thin layer of carbon. “We are able to attach the palladium particles to the graphene at intervals of just two and a half nanometers,” reports Stierle, who is the head of the DESY NanoLab. “This results in a regular, periodic structure.” The team, which also includes researchers from the Universities of Cologne and Hamburg, published its findings in the American Chemical Society (ACS) journal ACS Nano.

DESY’s X-ray source PETRA III was used to observe what happens when the palladium particles come into contact with hydrogen: Essentially, the hydrogen sticks to the nanoparticles’ surfaces, with hardly any of it penetrating inside. The nanoparticles can be pictured as resembling chocolates: An iridium nut at the center, enveloped in a layer of palladium, rather than marzipan, and chocolate-coated on the outside by the hydrogen. All it takes to recover the stored hydrogen is for a small amount of heat to be added; the hydrogen is rapidly released from the surface of the particles, because the gas molecules don’t have to push their way out from inside the cluster.
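A back-of-envelope estimate shows why such tiny particles shed their hydrogen so readily: in a 1.2-nanometer palladium cluster, the large majority of atoms already sit in the outermost atomic shell. The shell model below is a rough illustration, not the paper’s analysis; the lattice constant is the bulk fcc value for palladium.

```python
import math

# Rough shell-model estimate for a 1.2 nm palladium particle.
r = 0.6                      # particle radius, nm
a = 0.389                    # bulk fcc lattice constant of Pd, nm (4 atoms per cell)
d_atom = a / math.sqrt(2)    # nearest-neighbour distance, ~0.275 nm

volume = 4 / 3 * math.pi * r**3   # ~0.90 nm^3
n_atoms = volume * 4 / a**3       # ~61 atoms in the whole particle

# Fraction of atoms lying within one atomic spacing of the surface:
surface_fraction = 1 - ((r - d_atom) / r) ** 3   # ~0.84
```

With roughly 60 atoms in total and over 80 percent of them at the surface, there is essentially no bulk for hydrogen to get trapped in, which is consistent with the "chocolate glaze" picture above.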

“Next, we want to find out what storage densities can be achieved using this new method,” says Stierle. However, some challenges still need to be overcome before proceeding to practical applications. For example, other forms of carbon structures might be a more suitable carrier than graphene—the experts are considering using carbon sponges, containing tiny pores. Substantial amounts of the palladium nanoparticles should fit inside these.

The latest issue of DESY’s research magazine, femto, looks into this and other innovative concepts for the hydrogen economy and a sustainable energy supply. The magazine explains how fundamental research can contribute to innovations for the energy transition. This is not only about using hydrogen as an energy carrier, but also about sustainable solar cells and novel forms of energy generation, as well as achieving greater energy efficiencies in research itself, when operating large particle accelerators, for instance.



More information:
Dirk Franz et al, Hydrogen Solubility and Atomic Structure of Graphene Supported Pd Nanoclusters, ACS Nano (2021). DOI: 10.1021/acsnano.1c01997

Citation:
Team proposes ‘nano-chocolates’ as a new way to store hydrogen (2021, December 27)
retrieved 28 December 2021
from https://phys.org/news/2021-12-team-nano-chocolates-hydrogen.html


Scientists build new atlas of ocean’s oxygen-starved waters


Oxygen deficient zone intensity across the eastern Pacific Ocean, where copper colors represent the locations of consistently lowest oxygen concentrations and deep teal indicates regions without sufficiently low dissolved oxygen. Credits: Jarek Kwiecinski and Andrew Babbin

Life is teeming nearly everywhere in the oceans, except in certain pockets where oxygen naturally plummets and waters become unlivable for most aerobic organisms. These desolate pools are “oxygen-deficient zones,” or ODZs. And though they make up less than 1 percent of the ocean’s total volume, they are a significant source of nitrous oxide, a potent greenhouse gas. Their boundaries can also limit the extent of fisheries and marine ecosystems.

Now MIT scientists have generated the most detailed, three-dimensional “atlas” of the largest ODZs in the world. The new atlas provides high-resolution maps of the two major, oxygen-starved bodies of water in the tropical Pacific. These maps reveal the volume, extent, and varying depths of each ODZ, along with fine-scale features, such as ribbons of oxygenated water that intrude into otherwise depleted zones.

The team used a new method to process over 40 years’ worth of ocean data, comprising nearly 15 million measurements taken by many research cruises and autonomous robots deployed across the tropical Pacific. The researchers compiled and then analyzed this vast and fine-grained data to generate maps of oxygen-deficient zones at various depths, similar to the many slices of a three-dimensional scan.

From these maps, the researchers estimated the total volume of the two major ODZs in the tropical Pacific, more precisely than previous efforts. The first zone, which stretches out from the coast of South America, measures about 600,000 cubic kilometers—roughly the volume of water that would fill 240 billion Olympic-sized pools. The second zone, off the coast of Central America, is roughly three times larger.
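The pool comparison is easy to verify, taking the common 50 m × 25 m × 2 m definition of an Olympic-sized pool (2,500 cubic meters):

```python
# 600,000 km^3 expressed in Olympic-sized swimming pools
odz_volume_m3 = 600_000 * 1e9   # 1 km^3 = 1e9 m^3
pool_m3 = 50 * 25 * 2           # 50 m x 25 m x 2 m = 2,500 m^3
pools = odz_volume_m3 / pool_m3
print(f"{pools:.1e}")           # 2.4e+11, i.e. 240 billion pools
```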

The atlas serves as a reference for where ODZs lie today. The team hopes scientists can add to this atlas with continued measurements, to better track changes in these zones and predict how they may shift as the climate warms.

“It’s broadly expected that the oceans will lose oxygen as the climate gets warmer. But the situation is more complicated in the tropics where there are large oxygen-deficient zones,” says Jarek Kwiecinski ’21, who developed the atlas along with Andrew Babbin, the Cecil and Ida Green Career Development Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “It’s important to create a detailed map of these zones so we have a point of comparison for future change.”

The team’s study appears today in the journal Global Biogeochemical Cycles.

Airing out artifacts

Oxygen-deficient zones are large, persistent regions of the ocean that occur naturally, as a consequence of marine microbes gobbling up sinking phytoplankton along with all the available oxygen in the surroundings. These zones happen to lie in regions that miss passing ocean currents, which would normally replenish regions with oxygenated water. As a result, ODZs are locations of relatively permanent, oxygen-depleted waters, and can exist at mid-ocean depths of roughly 35 to 1,000 meters below the surface. For some perspective, the oceans on average run about 4,000 meters deep.

Over the last 40 years, research cruises have explored these regions by dropping bottles down to various depths and hauling up seawater that scientists then measure for oxygen.

“But there are a lot of artifacts that come from a bottle measurement when you’re trying to measure truly zero oxygen,” Babbin says. “All the plastic that we deploy at depth is full of oxygen that can leach out into the sample. When all is said and done, that artificial oxygen inflates the ocean’s true value.”

Rather than rely on measurements from bottle samples, the team looked at data from sensors attached to the outside of the bottles or integrated with robotic platforms that can change their buoyancy to measure water at different depths. These sensors measure a variety of signals, including changes in electrical currents or the intensity of light emitted by a photosensitive dye to estimate the amount of oxygen dissolved in water. In contrast to seawater samples that represent a single discrete depth, the sensors record signals continuously as they descend through the water column.

Scientists have attempted to use these sensor data to estimate the true value of oxygen concentrations in ODZs, but have found it incredibly tricky to convert these signals accurately, particularly at concentrations approaching zero.

“We took a very different approach, using measurements not to look at their true value, but rather how that value changes within the water column,” Kwiecinski says. “That way we can identify anoxic waters, regardless of what a specific sensor says.”

Bottoming out

The team reasoned that, if sensors showed a constant, unchanging value of oxygen in a continuous, vertical section of the ocean, regardless of the true value, then it would likely be a sign that oxygen had bottomed out, and that the section was part of an oxygen-deficient zone.

The researchers brought together nearly 15 million sensor measurements collected over 40 years by various research cruises and robotic floats, and mapped the regions where oxygen did not change with depth.
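The detection idea can be sketched in a few lines: scan each vertical profile for contiguous stretches where the raw sensor signal stops changing, whatever its absolute (and possibly biased) value. The function, tolerance, and toy data below are illustrative assumptions, not the authors’ actual pipeline:

```python
def find_anoxic_spans(profile, tol=0.01, min_points=5):
    """Flag contiguous depth spans where the raw sensor signal is
    effectively constant, regardless of its absolute value.

    profile: list of (depth_m, sensor_value) pairs, ordered by depth.
    tol: maximum signal variation treated as 'unchanging' (sensor units).
    min_points: minimum consecutive readings to count as a span.
    """
    spans, start = [], 0
    for i in range(1, len(profile) + 1):
        if i == len(profile) or abs(profile[i][1] - profile[start][1]) > tol:
            if i - start >= min_points:
                spans.append((profile[start][0], profile[i - 1][0]))
            start = i
    return spans

# Toy profile: the signal bottoms out at 1.2 (an arbitrary sensor
# offset) between 200 m and 400 m.
profile = [(100, 5.0), (150, 3.5), (200, 1.2), (250, 1.2),
           (300, 1.2), (350, 1.2), (400, 1.2), (450, 2.8)]
print(find_anoxic_spans(profile))  # [(200, 400)]
```

Because only the *change* in signal matters, the same test works for sensors with different calibrations or offsets, which is the point of the team’s approach.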

“We can now see how the distribution of anoxic water in the Pacific changes in three dimensions,” Babbin says.

The team mapped the boundaries, volume, and shape of two major ODZs in the tropical Pacific, one in the Northern Hemisphere, and the other in the Southern Hemisphere. They were also able to see fine details within each zone. For instance, oxygen-depleted waters are “thicker,” or more concentrated towards the middle, and appear to thin out toward the edges of each zone.

“We could also see gaps, where it looks like big bites were taken out of anoxic waters at shallow depths,” Babbin says. “There’s some mechanism bringing oxygen into this region, making it oxygenated compared to the water around it.”

Such observations of the tropical Pacific’s oxygen-deficient zones are more detailed than what’s been measured to date.

“How the borders of these ODZs are shaped, and how far they extend, could not be previously resolved,” Babbin says. “Now we have a better idea of how these two zones compare in terms of areal extent and depth.”

“This gives you a sketch of what could be happening,” Kwiecinski says. “There’s a lot more one can do with this data compilation to understand how the ocean’s oxygen supply is controlled.”



This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research.


Hexbyte Glen Cove How DNA is preserved in archaeological sediments for thousands of years


Sampling of an undisturbed block of impregnated sediment for ancient DNA analyses. Credit: MPI f. Evolutionary Anthropology

Sediments in which archaeological finds are embedded have long been regarded by most archaeologists as unimportant by-products of excavations. However, in recent years it has been shown that sediments can contain ancient biomolecules, including DNA. “The retrieval of ancient human and faunal DNA from sediments offers exciting new opportunities to investigate the geographical and temporal distribution of ancient humans and other organisms at sites where their skeletal remains are rare or absent,” says Matthias Meyer, senior author of the study and researcher at the Max Planck Institute for Evolutionary Anthropology in Leipzig.

To investigate the origin of DNA in the sediment, Max Planck researchers teamed up with an international group of geoarchaeologists—archaeologists who apply geological techniques to reconstruct the formation of sediment and sites—to study DNA preservation in sediment at a microscopic scale. They used undisturbed blocks of sediment that had been previously removed from archaeological sites and soaked in synthetic plastic-like (polyester) resin. The hardened blocks were taken to the laboratory and sliced in sections for microscopic imaging and genetic analysis.

The researchers successfully extracted DNA from a collection of blocks of sediment prepared as long as 40 years ago, from sites in Africa, Asia, Europe and North America. “The fact that these blocks are an excellent source of ancient DNA—including that originating from hominins—despite often decades of storage in plastic, provides access to a vast untapped repository of genetic information. The study opens up a new era of ancient DNA studies that will revisit samples stored in labs, allowing for analysis of sites that have long since been back-filled, which is especially important given travel restrictions and site inaccessibility in a pandemic world,” says Mike Morley from Flinders University in Australia, who led some of the geoarchaeological analyses.

Surface of a section of undisturbed block of impregnated sediment from Denisova Cave. Credit: Mike Morley

Abundance of micro remains in the sediment matrix

The scientists used blocks of sediment from Denisova Cave, a site located in the Altai Mountains in South Central Siberia where ancient DNA from Neanderthals, Denisovans and modern humans has been retrieved, and showed that small organic particles yielded more DNA than sediment sampled randomly. “It clearly shows that the high success rate of ancient mammalian DNA retrieval from Denisova Cave sediments comes from the abundance of micro remains in the sediment matrix rather than from free extracellular DNA from feces, bodily fluids or decomposing cellular tissue potentially adsorbed onto mineral grains,” says Vera Aldeias, co-author of the study and researcher at the University of Algarve in Portugal. “This study is a big step toward understanding precisely where and under what conditions ancient DNA is preserved in sediments,” says Morley.

The approach described in the study allows highly localized micro-scale sampling of sediment for DNA analyses and shows that ancient DNA (aDNA) is not uniformly distributed in the sediment, and that specific sediment features are more conducive to ancient DNA preservation than others. “Linking sediment aDNA to the archaeological micro-context means that we can also address the possibility of physical movement of aDNA between sedimentary deposits,” says Susan Mentzer, a researcher at the Senckenberg Centre for Human Evolution and Palaeoenvironment (Germany).

Diyendo Massilani, the lead author of the study, was able to recover substantial amounts of Neanderthal DNA from only a few milligrams of sediment. He could identify the sex of the individuals who left their DNA behind, and showed that they belonged to a population related to a Neanderthal whose genome was previously reconstructed from a bone fragment discovered in the cave. “The Neanderthal DNA in these small samples of plastic-embedded sediment was far more concentrated than what we typically find in loose material,” he says. “With this approach it will become possible in the future to analyze the DNA of many different ancient human individuals from just a small cube of solidified sediment. It is amusing to think that this is presumably so because they used the cave as a toilet tens of thousands of years ago.”

The research was published in Proceedings of the National Academy of Sciences.



More information:
Microstratigraphic preservation of ancient faunal and hominin DNA in Pleistocene cave sediments, Proceedings of the National Academy of Sciences, DOI: 10.1073/pnas.2113666118

Citation:
How DNA is preserved in archaeological sediments for thousands of years (2021, December 27)
retrieved 28 December 2021
from https://phys.org/news/2021-12-dna-archaeological-sediments-thousands-years.html
