
Hexbyte Glen Cove Coconut tree cloning breakthrough will help propagation and preservation


Coconut trees grow slowly and are difficult to clone. Scientists at KU Leuven and the Alliance multiplied seedlings faster and conserved coconut genetic resources for the long term. Credit: KU Leuven – Hannes Wilms

Coconut trees grow slowly and are difficult to clone. Scientists at KU Leuven and the Alliance multiplied seedlings faster and conserved coconut genetic resources for the long term. This will help to preserve coconut tree biodiversity and meet the increasing demand for coconuts and derived products. 

The coconut is the sixth most cultivated fruit on earth and demand for it continues to rise. In recent years, coconut oil has seen an increase in consumer demand. Coconut water is also used as a natural alternative to sugary soft drinks. But the coconut tree, which mainly grows in regions around the equator, faces many challenges: Lethal Yellowing Disease, climate change, rising sea levels and outdated plantations. This is why researchers at KU Leuven and the Alliance of Bioversity International and CIAT have developed a method to multiply coconut plants faster and store them more efficiently in gene banks.

“Nobody thought we could do it”

Bart Panis, of the Laboratory for Tropical Crop Improvement (KU Leuven) and the Alliance, and Ph.D. student Hannes Wilms drew their inspiration for this research from another fruit crop: the banana. From his work on banana, Panis suspected that a certain approach could also be successful in coconut trees. “Coconut plants do not form side shoots. They put all their energy into one shoot that has to grow as fast and as tall as possible. This makes it very difficult to clone and store the plant,” Panis said. Not anymore.

For their research, the scientists first extracted the coconut tree embryo from the coconut. Then they applied a plant hormone to the meristem—or growing point—contained in the embryo. In this way, they succeeded in having the embryo form not only one shoot, but several side shoots. They managed, in turn, to split these shoot clusters and allow new side shoots to grow on them as well. Their findings were published on 15 September in Scientific Reports.

“Nobody thought we could do it. But we persevered with determination,” Panis said. “There are other methods of vegetatively propagating coconut trees but we are confident that this is the first method that is also time- and cost-efficient.” Thousands of new specimens of a single coconut plant can be obtained that share the mother plant’s exact same genetic profile. This offers enormous potential for coconut plantations worldwide.
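The arithmetic behind that potential is simple exponential multiplication. As a rough sketch in Python, where the shoots-per-cluster figure and the cycle count are assumptions for illustration rather than numbers reported in the study, repeated splitting of shoot clusters compounds quickly:

# Illustrative only: shoots per cluster and number of cycles are assumed values,
# not figures from the Scientific Reports paper.
def plantlets_after(cycles, shoots_per_cluster=4):
    """Plantlets obtained if every cluster is split each cycle and each
    piece regrows the same number of side shoots."""
    return shoots_per_cluster ** cycles

for cycles in range(1, 7):
    print(f"after {cycles} multiplication cycles: {plantlets_after(cycles):>4} plantlets")
# With these assumed numbers, six rounds of splitting already yield 4,096
# genetically identical plantlets from a single embryo.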

Challenges for the coconut

First and foremost, the team wants to safeguard the coconut tree’s genetic diversity as efficiently as possible. It is crucial to preserve as many varieties of the coconut plant as possible, because each variety has specific characteristics. Some are resistant to a particular disease or have a better oil composition, while others are more resistant to heat, drought or storms.

“Currently, these genetic resources are kept as trees in field collections. But some of these collections are threatened by Lethal Yellowing Disease,” said Hannes Wilms, a co-author from KU Leuven. “Our technique now also allows the shoots of the coconut plant to be preserved for eternity by cryopreservation, in liquid nitrogen at a temperature of -196°C.”

“This is important for the future: if a new disease was to affect coconut production, there may be a variety in the gene bank that is resistant to this disease and can be planted out in the affected areas,” said Panis.

Small farmer, fair price

For today’s coconut production, too, the technique comes at the right time. “There’s a huge demand for coconut. The existing plantations are old and need to be replanted in the short term,” Panis said. “So our technique meets the great demand for healthy plant material.”

The researchers have now submitted a patent application. All that is needed now is additional funds to protect the patent and further refine the technique. In this process, the researchers will not lose sight of small-scale producers. “Because coconut production is often in the hands of small farmers, we will have a special clause included in the patent license: whoever applies our technique will have to ensure that small farmers can purchase the plant material at a reasonable price,” Panis said.



More information:
Hannes Wilms et al, Development of the first axillary in vitro shoot multiplication protocol for coconut palms, Scientific Reports (2021). DOI: 10.1038/s41598-021-97718-1

Citation:
Coconut tree cloning breakthrough will help propagation and preservation (2021, September 15)
retrieved 16 September 2021
from https://phys.org/news/2021-09-coconut-tree-cloning-breakthrough-propagation.html




Hexbyte Glen Cove Climate change, logging collide—and a forest shrinks



Looking down a hillside dotted with large stumps and nearly devoid of trees, a pair of retired U.S. Forest Service employees lamented logging policies they helped craft to deal with two harbingers of climate change—pine beetles and wildfires.

Timber production dramatically ramped up two decades ago in the Black Hills National Forest along the South Dakota-Wyoming border, as beetles ravaged huge expanses of forest and worries grew over wildfires.

The beetles left, but the loggers haven’t—and they’re now felling trees at twice the rate government scientists say is sustainable. That means the Black Hills forests are shrinking, with fewer and smaller trees.

Timber sales from federal forests nationwide more than doubled over the past 20 years, according to government data. In Washington, D.C., Republicans and Democrats alike have pushed more aggressive thinning of stands to reduce vegetation that fuels wildfires.

But critics of federal forest management say that in their fervor to do something about climate change, officials are allowing the removal of too many older trees that can actually better withstand fire.

In the Black Hills, stands of century-old ponderosa pines were thinned over the past two decades, then thinned again. In some areas, most of the remaining older and larger trees are being cut, leaving hillsides almost bare.

“Eventually you’re not going to have any big trees on the whole forest,” said Dave Mertz, who worked as a government natural resources officer overseeing Black Hills logging until retiring in 2017. “The timber industry is pulling the strings now. The Forest Service has lost its way.”

DIRE PREDICTIONS

Across the western U.S., more trees have been dying as climate change dramatically alters the landscape and leaves forests more susceptible. Wildfires, insects and disease are the top killers, researchers say.

A sweeping government review of forest health surveys since 1993 found that the rate of trees dying increased this century and outpaced new growth in all eight states examined—Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah and Wyoming. Timber harvested from Forest Service lands over the past two decades also increased.

In the Black Hills, those two trends have collided. With more trees being logged and even more killed by beetles and fires in recent years, government scientists say the forest can’t grow fast enough to keep up.

The timber industry and allies in Congress are pushing back against that conclusion. Timber company representatives predict dire economic consequences if forest managers sharply reduce harvest levels. And they say wildfires and beetle outbreaks would get worse.

One of the region’s seven mills closed in March, eliminating 120 jobs in Hill City, South Dakota. Owner Neiman Enterprises said a recent slowdown in timber sales meant it wouldn’t have enough logs.

“These companies aren’t tech startups. They are multi-generational family companies that want to be there for the long term,” said Ben Wudtke, director of the Black Hills Forest Resource Association, a group of sawmills and logging companies.

FIGHTING FIRE

To counter growing havoc from western wildfires, Biden’s administration wants to double the forest acreage thinned or treated with prescribed burns to 6 million acres (2.4 million hectares) annually—bigger than New Hampshire.

One method to reduce fire risk is to remove dense stands of small trees and thick underbrush that accumulated for decades as wildfires—a natural part of the landscape—were suppressed.

It’s expensive, labor-intensive work, and there’s little market value in small trees. When sworn in this summer, Forest Service Chief Randy Moore said combating climate change will require making it worthwhile to harvest smaller trees, such as using the vegetation as biomass to generate electricity.

“It doesn’t pay for itself and we don’t have markets that seem to be increasing quickly enough,” he said.

The service’s former deputy chief, Jim Furnish, criticized the agency as too focused on timber production and too slow to react to climate change, to the detriment of the forest.

There are signs of change under President Joe Biden, including the administration’s move last month to end large-scale commercial logging of old-growth trees in Alaska’s Tongass National Forest.

But other projects that include old-growth removal are pending, including in Montana’s Kootenai National Forest along the Canada border, the Kaibab National Forest just north of the Grand Canyon in Arizona and Idaho’s Nez Perce-Clearwater National Forest.

“The Forest Service’s approach to date has been to attack this as a management problem: ‘We need to cut more trees,'” Furnish told The Associated Press. “You can’t cut your way out of this problem.”

Moore, the agency’s chief, acknowledged the warming planet was forcing changes, but said he hoped to find a “sweet spot” between the environment and industry—while removing enough vegetation to reduce wildfire risk. In the Black Hills, officials said they would consider the latest science alongside economic impacts as they seek to make logging sustainable.

“We need the industry to help us,” Moore said, referring to the thinning work needed to reduce wildfire risk. “It’s not really about timber sales or cutting large trees.”

“BEAT TO HELL”

The Black Hills played an outsized role in the early formation of the nation’s timber policies. In the 1890s, excessive logging to feed demand for timbers for a nearby gold mine helped spur creation of the national forest system. The first regulated logging sales in forest service history took place there in 1899.

When artist and environmentalist Mary Zimmerman bought property within the Black Hills in 1988, neighboring public lands where that first timber sale took place had regrown so successfully that huge branches overhead “were like a cathedral.”

The site was thinned in 1990, removing some big trees but leaving many. It was thinned more in 2016. Then logging crews returned last year and took out the remaining big trees. Cattle now graze the area.

“It’s just beat to hell,” said Zimmerman.

Her account was confirmed by Blaine Cook, forest management scientist for the Black Hills for more than two decades until his 2019 retirement.

EARLY WARNINGS

Cook said his monitoring began to show last decade that the forest’s growth rate wasn’t keeping up with aggressive logging that was a response to the pine beetle outbreak that began in 1998. The high harvest rate continued after the outbreak peaked in 2012 and even after it ended in 2017.

Cook said his warnings that the forest was being damaged were rejected by superiors who faced political pressure to provide a steady supply of logs to sawmills in South Dakota and Wyoming.

Disagreement within the agency over whether there was too much logging culminated in a report this April by scientists from the forest service’s research branch that was unequivocal: Black Hills logging needs to be cut back by at least half, possibly more, to be sustainable.

The problem is that the forest changed but logging rates have not, said Mike Battaglia, one of the lead authors.

“In the late 90’s, you had twice as much volume” of trees in the forest, he said. “To take out the same amount now, you’re taking too much.”
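A back-of-the-envelope calculation shows why the same cut weighs twice as heavily today. The Python sketch below uses placeholder units; only the two-to-one volume ratio comes from Battaglia's comparison, and the assumption that a sustainable cut scales with standing volume is a simplification, not the research branch's model:

# Placeholder units; only the 2:1 volume ratio is taken from the article.
volume_late_1990s = 2.0   # relative standing timber volume in the late 1990s
volume_today = 1.0        # roughly half as much volume today, per Battaglia
annual_harvest = 0.2      # the same absolute cut in both periods (arbitrary units)

share_then = annual_harvest / volume_late_1990s   # 10% of standing volume per year
share_now = annual_harvest / volume_today         # 20%, double the pressure

print(f"harvest as a share of standing volume, late 1990s: {share_then:.0%}")
print(f"harvest as a share of standing volume, today:      {share_now:.0%}")
# If the sustainable cut scales with what is standing, holding the old harvest
# level on half the volume is roughly twice the sustainable rate, in line with
# the report's call to reduce logging by at least half.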

Forest industry representatives criticized the government’s multi-year study for including only parts of the forest, saying that created an incomplete picture of how many trees are available to harvest.

They estimated up to 80% of the region’s jobs would be lost if the Forest Service reduced logging to recommended levels. If that happens, they said, the agency would have difficulty finding companies willing to do less profitable thinning work for wildfire protection.

“You have to have somebody around to do it,” said the industry’s Wudtke. “It’s really critical that we keep these companies going.”






Hexbyte Glen Cove Rethink ‘cost-benefit analysis’ to tackle climate crisis



In a new paper, a group of leading researchers and policy experts argue that improving and enriching existing policy analysis methods – including costs and benefits among multiple other factors such as uncertainty, resilience and a better understanding of innovation – would lead to better decisions.

It addresses the recent amendment of the UK’s guidelines for policy analysis, which identifies the need for special treatment of policies that aim to drive systemic “transformational change,” including climate policy.

The paper’s authors say “inadequacies” in the way policies are devised might be hindering global climate action.

Ahead of the vital COP26 UN climate change conference in Glasgow later this year, they offer improved principles for policymaking during times of dynamic and transformational change.

The paper comes from the Economics of Energy Innovation and System Transition (EEIST) project, led by the University of Exeter.

“Calculations of the economic costs and benefits of policies, although they are considered alongside other considerations, have substantial influence on decisions,” said EEIST director Dr Jean-Francois Mercure, of Exeter’s Global Systems Institute.

“In periods of rapid change – like now – it’s extremely difficult to accurately estimate these costs and benefits, especially far in the future.”

“We don’t have enough certainty about the future to make sufficiently reliable predictions, so we need to consider how to use uncertainty to our advantage.”

“This is what our framework offers.”

As well as switching the focus away from an excessive reliance on costs, benefits and economic valuation, towards evaluating risks, opportunities and resilience, the new framework:

  • considers multiple interacting factors, acknowledging that changes will need to be made in light of changing circumstances.
  • analyzes processes of transformation instead of predicting outcomes at a moment in time.

Co-author Simon Sharpe, policy lead for COP26 at the UK Government’s Cabinet Office, said: “Policymaking on issues such as climate change involves fundamental uncertainty, widely differing interests and the potential for structural change in the economy.”

“We make better decisions when these factors are the focus of our analysis, not assumed away or left on the sidelines.”

In one example of “static” cost-benefit analysis hindering global climate action, such analysis suggested that replacing coal with gas would be the cheapest way to reduce emissions.

However, this ignored the dynamic “feedbacks” (self-reinforcing chain reactions) that, through a complex process of cumulative innovation and industrial development, eventually drove renewables to become the cheapest form of electricity generation.
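One common way to formalize those feedbacks, offered here as a general illustration rather than the specific model used in the EEIST paper, is the experience (or learning) curve, in which unit cost falls by a fixed fraction with every doubling of cumulative deployment:

$$C(x) = C_0 \left(\frac{x}{x_0}\right)^{-\alpha}, \qquad \text{learning rate} = 1 - 2^{-\alpha}$$

where $x$ is cumulative installed capacity, $C_0$ is the cost at a reference capacity $x_0$, and $\alpha$ sets how steeply costs fall. A “static” appraisal that holds $C$ fixed misses the point that deploying more of a technology lowers its future cost, which is exactly the self-reinforcing loop described above.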

During the COVID pandemic, governments have been forced to react quickly to rapidly changing situations, and this may offer hope for more agile policy in future.

The EEIST research team will further develop the framework and analyze its benefits for policymakers as they respond to the accelerating climate crisis.

The paper, published in the journal Global Environmental Change, is entitled: “Risk-opportunity analysis for transformative policy design and appraisal.”



More information:
Jean-Francois Mercure et al, Risk-opportunity analysis for transformative policy design and appraisal, Global Environmental Change (2021). DOI: 10.1016/j.gloenvcha.2021.102359

Citation:
Rethink ‘cost-benefit analysis’ to tackle climate crisis (2021, September 14)
retrieved 15 September 2021
from https://phys.org/news/2021-09-rethink-cost-benefit-analysis-tackle-climate.html



Hexbyte Glen Cove Scientists find eco-friendly way to dye blue jeans


The process of dyeing jeans is one of the top sources of pollution within the fashion industry. New research found a more eco-friendly alternative. Credit: Andrew Davis Tucker/UGA

Flared or skinny, distressed or acid-washed—we all love our blue jeans. But those coveted pieces of denim are wreaking havoc on the environment.

That’s why researchers from the University of Georgia developed a new indigo dyeing technology that’s kinder on the planet. The new technique reduces water usage and eliminates the toxic chemicals that make the dyeing process so environmentally damaging. And to top it off, the technology streamlines the process and secures more color than traditional methods.

“The textile industry is a classic example of an environmental polluter, and one of the major causes of pollution in the industry is coloration,” said Sergiy Minko, corresponding author of the study and the Georgia Power Professor of Fiber and Polymer Science in the College of Family and Consumer Sciences. Originally, natural indigo was used to dye textiles. Introduced to the Colonies in the 1700s, indigo was an important cash crop for early America. But the discovery of a way to produce synthetic indigo almost entirely wiped the natural indigo market off the map.

Indigo isn’t water soluble, though, and has to be reduced with toxic chemicals before it can be used to dye clothing. The denim industry uses more than 45,000 tons of synthetic indigo a year, along with over 84,000 tons of sodium hydrosulfite as a reducing agent and 53,000 tons of lye.

Eliminating toxic chemicals

It takes between 50 and 100 liters of water to dye just one pair of jeans. And that water, chock-full of toxic chemicals, has to go somewhere. Although there are now regulations in place requiring U.S. plants to somewhat purify that wastewater, the industry simply released it into the environment for decades, corroding sewage pipes on its way to rivers and the ocean. Even now, chemical-contaminated water inevitably winds up in local waterways, particularly from industrial factories in developing nations, where production is frequently outsourced.

Published in the journal Green Chemistry, the new method of dyeing uses natural indigo (though the streamlined process could also use synthetic) and completely eliminates the use of harmful chemicals used in conventional methods. It also requires only one coat of the indigo to secure over 90% of the color, significantly reducing the amount of water needed to dye the fabric. Conventional methods require up to eight dips in dye solution and secure only 70% to 80%.
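Those figures allow a rough comparison of water use per pair. The Python sketch below assumes water consumption scales with the number of dye dips; the dip counts, fixation percentages and the 50 to 100 liter range come from the article, but that proportionality is an assumption, not a measurement from the study:

# Rough comparison; assumes water use scales linearly with the number of dye dips.
# Dip counts, fixation percentages and the 50-100 liter range come from the article.
conventional_dips = 8
new_method_dips = 1

for liters in (50, 100):                       # conventional water use per pair of jeans
    estimated_new = liters * new_method_dips / conventional_dips
    print(f"conventional ~{liters} L/pair -> new method ~{estimated_new:.0f} L/pair (if water scales with dips)")

# Color fixed per pass, using the article's figures:
print(f"per dip, conventional: ~{0.75 / conventional_dips:.0%} (70-80% total over up to 8 dips)")
print(f"per coat, new method:  over {0.90 / new_method_dips:.0%} in a single coat")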

The new method doesn’t sacrifice comfort either, keeping around the same levels of thickness, weight gain and flexibility in the fabric. Because of the streamlined process, it saves workers time and energy by eliminating the need for multiple dips and oxidation time between each dip.

“You don’t reduce the indigo in this process; you don’t dissolve it,” Minko said. “You simply mix it with nanocellulose fibrils and deposit it on the surface of the textile. And you can change the shade of blue by the amount of indigo particles added to the mixture.”

Nanocellulose is a relatively recent creation that consists of the wood pulp you’d normally see used in the paper industry. The new technology mixes indigo particles with the nanofibers and then deposits them on the surface of the textile, essentially “gluing” the color in place.

Increasing industry sustainability

The textile industry has long been known as one of the world’s most significant sources of pollution. And in the 20th century, when the world’s population skyrocketed, so did the need for mass-produced textiles, making the already bad pollution problem even worse.

Levi Strauss patented the first pair of jeans in 1873, and the practical pant soon became a fashion mainstay. Billions of pairs are now sold each year, comprising a market valued at over $90 billion.

Although the new technology still needs to be commercialized, it’s a viable option for making the denim industry more sustainable.

“Denim and jeans manufacturing are a big market, so even small changes in the industry could have huge impacts,” Minko said. “There are populations that are looking for products that are made in environmentally friendly ways. And as regulations become tougher, the industry will have to adapt.”

This research is part of the Ph.D. study of Smriti Rai, a doctoral candidate in the Department of Textiles, Merchandising and Interiors. Other co-authors of the paper include the following from UGA’s Department of Textiles, Merchandising and Interiors: Raha Saremi, materials researcher and lab manager; and Suraj Sharma, professor.



More information:
Smriti Rai et al, Environment-friendly nanocellulose-indigo dyeing of textiles, Green Chemistry (2021). DOI: 10.1039/D1GC02043A




Hexbyte Glen Cove The viruses of the North Sea


Helgoland is Germany’s only true offshore island, known more for seabirds, seals and duty-free shopping than for viruses. However, these were precisely the focus of the MPI researchers. Credit: Max Planck Institute for Marine Microbiology, Naomi Esken

A new study provides exciting insights into the life of marine viruses in the North Sea during the spring bloom. Off the offshore island of Helgoland, researchers led by Nina Bartlau from the Max Planck Institute for Marine Microbiology found a dynamic viral community that can strongly influence the mortality of North Sea bacteria and thereby the carbon cycle of this habitat. They also discovered and isolated numerous new virus species.

On average, hundreds of thousands of tiny algae and a billion bacteria live in every liter of North Sea water. As if that were not enough, there are also ten billion viruses in every liter. They primarily infect bacteria and have a large and diverse impact on their environment, for example by killing infected cells and causing them to decay, or by altering their gene expression or genetic material. Now, a study provides new insights into the life of these viruses.

“Despite their importance, viruses are rarely a focus of marine research,” says first author Nina Bartlau from the Max Planck Institute for Marine Microbiology. “To my knowledge, we hereby provide the first study dealing with viruses of Flavobacteria, which are the most common bacteria in spring blooms, in the North Sea.”

Many new viruses discovered

The researchers found many new and very diverse phages in the North Sea. Phages are viruses that infect bacteria. Accordingly, the viruses studied here, which are specialized to infect Flavobacteria, are called flavophages. “The abundance and types of phages changed during the spring bloom. For example, at the beginning of the bloom we found only a few phages, which increased in abundance over time. We were also able to show that a particular phage species was only present for a short period of time, after which only its relatives remained,” explains co-author Cristina Moraru from the Institute of Chemistry and Biology of the Marine Environment at the University of Oldenburg. Over the years, however, the community of viruses was quite stable: Different phages could be found and isolated in two consecutive years.

Bacteriophages – viruses that infect bacteria – usually have a so-called head and a tail from which small spikes protrude like little legs. The viruses called Peternella in this electron microscope image are about 150 nanometers long, i.e. about 500 times smaller than the thickness of a human hair. Credit: Georg Krohne

Bartlau and her colleagues have thus collected numerous pieces of evidence indicating that viruses have a major impact on bacterial mortality during the spring bloom. The bacteria are responsible for breaking down the remains of microscopic algae over the course of the bloom, releasing much of the carbon dioxide the algae had absorbed from the atmosphere. “There is a possible link here between the viruses and the carbon cycle, which will certainly be very exciting to investigate,” says Bartlau.

Outnumbered: The tiny viruses crowd closely around their much larger hosts, the bacteria. About ten to 45 percent of bacteria in seawater are infected with viruses. Credit: Georg Krohne

Now also in the laboratory

The researchers also succeeded in cultivating numerous different and previously unknown viruses in the laboratory in Bremen. “We identified ten new genera and also ten new families,” says Bartlau. Many of them—nine of the ten genera and four of the ten families to be exact—did not exist in culture before. “These new isolates now allow for exciting experiments in the lab to further our knowledge of flavophages and the role of viruses in the ocean,” Bartlau emphasizes. And this knowledge then goes far beyond the marine habitat, because some of the virus families isolated here also have members that live in freshwater, sewage and soil.

The research was published in The ISME Journal.



More information:
Nina Bartlau et al, Highly diverse flavobacterial phages isolated from North Sea spring blooms, The ISME Journal (2021). DOI: 10.1038/s41396-021-01097-4

Citation:
The viruses of the North Sea (2021, September 14)
retrieved 15 September 2021
from https://phys.org/news/2021-09-viruses-north-sea.html




Hexbyte Glen Cove Astronomers spot the same supernova three times—and predict a fourth sighting in 16 years


Now you see them, now you don't. Three views of the same supernova appear in the 2016 image on the left, taken by the Hubble Space Telescope. But they're gone in the 2019 image. The distant supernova, named Requiem, is embedded in the giant galaxy cluster MACS J0138. The cluster is so massive that its powerful gravity bends and magnifies the light from the supernova, located in a galaxy far behind it. Called gravitational lensing, this phenomenon also splits the supernova's light into multiple mirror images, highlighted by the white circles in the 2016 image. The multiply imaged supernova disappears in the 2019 image of the same cluster, at right. The snapshot, taken in 2019, helped astronomers confirm the object's pedigree. Supernovae explode and fade away over time. Researchers predict that a rerun of the same supernova will make an appearance in 2037. The predicted location of that fourth image is highlighted by the yellow circle at top left. The light from Supernova Requiem needed an estimated 10 billion years for its journey, based on the distance of its host galaxy. The light that Hubble captured from the cluster, MACS J0138.0-2155, took about four billion years to reach Earth. The images were taken in near-infrared light by Hubble's Wide Field Camera 3. Credit: IMAGE PROCESSING: Joseph DePasquale (STScI)

It’s challenging to make predictions, especially in astronomy. There are, however, a few forecasts astronomers can depend on, such as the timing of upcoming lunar and solar eclipses and the clockwork return of some comets.

Now, looking far beyond the solar system, astronomers have added a solid prediction of an event happening deep in intergalactic space: an image of an exploding star, dubbed Supernova Requiem, which will appear around the year 2037. Although this rebroadcast will not be visible to the naked eye, some future telescopes should be able to spot it.

It turns out that this future appearance will be the fourth-known view of the same supernova, magnified, brightened, and split into separate images by a massive foreground cluster of galaxies acting like a cosmic zoom lens. Three images of the supernova were first found from archival data taken in 2016 by NASA’s Hubble Space Telescope.

The multiple images are produced by the monster galaxy cluster’s powerful gravity, which distorts and magnifies the light from the supernova far behind it, an effect called gravitational lensing. First predicted by Albert Einstein, this effect is similar to a glass lens bending light to magnify the image of a distant object.

The three lensed supernova images, seen as tiny dots captured in a single Hubble snapshot, represent light from the explosive aftermath. The dots vary in brightness and color, which signify three different phases of the fading blast as it cooled over time.

“This new discovery is the third example of a multiply imaged supernova for which we can actually measure the delay in arrival times,” explained lead researcher Steve Rodney of the University of South Carolina in Columbia. “It is the most distant of the three, and the predicted delay is extraordinarily long. We will be able to come back and see the final arrival, which we predict will be in 2037, plus or minus a couple of years.”

The light that Hubble captured from the cluster, MACS J0138.0-2155, took about four billion years to reach Earth. The light from Supernova Requiem needed an estimated 10 billion years for its journey, based on the distance of its host galaxy.

The team’s prediction of the supernova’s return appearance is based on computer models of the cluster, which describe the various paths the supernova light is taking through the maze of clumpy dark matter in the galactic grouping. Dark matter is an invisible material that comprises the bulk of the universe’s matter and is the scaffolding upon which galaxies and galaxy clusters are built.

Each magnified image takes a different route through the cluster and arrives at Earth at a different time, due, in part, to differences in the length of the pathways the supernova light followed.

“Whenever some light passes near a very massive object, like a galaxy or galaxy cluster, the warping of space-time that Einstein’s theory of general relativity tells us is present for any mass, delays the travel of light around that mass,” Rodney said.

He compares the supernova’s various light paths to several trains that leave a station at the same time, all traveling at the same speed and bound for the same location. Each train, however, takes a different route, and the distance for each route is not the same. Because the trains travel over different track lengths across different terrain, they do not arrive at their destination at the same time.

In addition, the lensed supernova image predicted to appear in 2037 lags behind the other images of the same supernova because its light travels directly through the middle of the cluster, where the densest amount of dark matter resides. The immense mass of the cluster bends the light, producing the longer time delay. “This is the last one to arrive because it’s like the train that has to go deep down into a valley and climb back out again. That’s the slowest kind of trip for light,” Rodney explained.
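In lens modeling, the two effects Rodney describes, the longer geometric path and the extra delay from sitting deep in the cluster's gravitational potential, are usually combined in a single standard time-delay expression (quoted here in its general form, not as a calculation specific to Supernova Requiem):

$$\Delta t(\theta) = \frac{1+z_{l}}{c}\,\frac{D_{l} D_{s}}{D_{ls}}\left[\frac{(\theta-\beta)^{2}}{2} - \psi(\theta)\right]$$

where $z_l$ is the lens redshift, $D_l$, $D_s$ and $D_{ls}$ are the distances to the lens, to the source and between the two, $\theta$ and $\beta$ are the observed image position and the true source position, and $\psi$ is the lensing potential set by the cluster's (mostly dark) mass. The image predicted for 2037 arrives last because its light path runs through the deepest part of that potential.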

The lensed supernova images were discovered in 2019 by Gabe Brammer, a study co-author at the Cosmic Dawn Center at the Niels Bohr Institute, University of Copenhagen, in Denmark. Brammer spotted the mirrored supernova images while analyzing distant galaxies magnified by massive foreground galaxy clusters as part of an ongoing Hubble program called REsolved QUIEscent Magnified Galaxies (REQUIEM).

He was comparing new REQUIEM data from 2019 with archival images taken in 2016 from a different Hubble science program. A tiny red object in the 2016 data caught his eye, which he initially thought was a far-flung galaxy. But it had disappeared in the 2019 images.

“But then, on further inspection of the 2016 data, I noticed there were actually three magnified objects, two red and a purple,” he explained. “Each of the three objects was paired with a lensed image of a distant massive galaxy. Immediately it suggested to me that it was not a distant galaxy but actually a transient source in this system that had faded from view in the 2019 images like a light bulb that had been flicked off.”

Brammer teamed up with Rodney to conduct a further analysis of the system. The lensed supernova images are arranged in an arc around the cluster’s core. They appear as small dots near the smeared orange features that are thought to be the magnified snapshots of the supernova’s host galaxy.

Study co-author Johan Richard of the University of Lyon in France produced a map of the amount of dark matter in the cluster, inferred from the lensing it produces. The map shows the predicted locations of lensed objects. This supernova is predicted to appear again in 2042, but it will be so faint that the research team thinks it will not be visible.

Catching the rerun of the explosive event will help astronomers measure the time delays between all four supernova images, which will offer clues to the type of warped-space terrain the exploded star’s light had to cover. Armed with those measurements, researchers can fine-tune the models that map out the cluster’s mass. Developing precise dark-matter maps of massive galaxy clusters is another way for astronomers to measure the universe’s expansion rate and investigate the nature of dark energy, a mysterious form of energy that works against gravity and causes the cosmos to expand at a faster rate.

This time-delay method is valuable because it’s a more direct way of measuring the universe’s expansion rate, Rodney explained. “These long time delays are particularly valuable because you can get a good, precise measurement of that time delay if you are just patient and wait years, in this case more than a decade, for the final image to return,” he said. “It is a completely independent path to calculate the universe’s expansion rate. The real value in the future will be using a larger sample of these to improve the precision.”
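The link between those delays and the expansion rate comes from the prefactor in the delay formula above, a quantity known as the time-delay distance:

$$D_{\Delta t} \equiv (1+z_{l})\,\frac{D_{l} D_{s}}{D_{ls}} \propto \frac{1}{H_0}$$

For a fixed mass model of the cluster, the measured delay scales with $D_{\Delta t}$, so $\Delta t \propto 1/H_0$: a well-measured delay, combined with a good lens model, yields the Hubble constant without reference to the usual distance-ladder methods. (This is the standard time-delay cosmography relation, not a number derived in the Requiem study.)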

Spotting lensed images of supernovae will become increasingly common in the next 20 years with the launch of NASA’s Nancy Grace Roman Space Telescope and the start of operations at the Vera C. Rubin Observatory. Both telescopes will observe large swaths of the sky, which will allow them to spot dozens more multiply imaged supernovae.

Future telescopes such as NASA’s James Webb Space Telescope also could detect light from supernova Requiem at other epochs of the blast. The team’s results will appear on September 13 in the journal Nature Astronomy.



More information:
Steven A. Rodney et al, A gravitationally lensed supernova with an observable two-decade time delay, Nature Astronomy (2021). DOI: 10.1038/s41550-021-01450-9

Citation:
Rerun of supernova blast expected to appear in 2037 (2021, September 13)
retrieved 14 September 2021
from https://phys.org/news/2021-09-astronomers-supernova-timesand-fourth-sighting.html




Hexbyte Glen Cove Rain helps fight against ‘monster’ fire in Spain


About 500 firefighters and 51 water-dropping planes have been tackling the blaze.

Light rain gave some breathing room to crews struggling on Monday to battle a “monster” wildfire in Spain that has killed a firefighter and forced roughly 2,600 people from their homes.

About 500 firefighters and 51 water-dropping planes have been tackling the blaze, which officials believe was started deliberately last Wednesday in the southern Malaga province, the regional government said.

Firefighters were joined on Sunday by some 260 soldiers from the military’s emergency brigade as they battled to control the blaze in the Sierra Bermeja mountains.

TV footage showed people who had been evacuated to the town of Ronda breaking into applause as it began to rain.

But firefighters said the rain would not be enough to bring the blaze under control.

“The rain will not put out the fire,” said regional fire chief Juan Sanchez.

“But in places where we have it under control, it will help shorten the time to extinguish it completely.”

‘Ray of hope’

The fire, described by local emergency services as “complex and exceptional”, has so far destroyed some 8,000 hectares (20,000 acres).

Erratic winds, scorching temperatures and low humidity levels had helped to turn the blaze into a “hungry monster”, the region’s deputy fire chief Alejandro Garcia said last week.

Authorities had on Sunday removed some 1,600 people from six villages as a precaution.

Roughly 1,000 people, who had been evacuated from the coastal resort of Estepona when the blaze first began, were on Monday allowed to go back home.

“Controlling it today seems like too much to ask but there is a ray of hope,” regional environment minister Carmen Crespo told news radio Ser.

Two firefighters were injured in falls on Sunday, the regional government said, while a 44-year-old firefighter died on Thursday. His funeral was held on Sunday.

Southern Spain is the latest area around the Mediterranean basin to be hit by wildfires this summer, a seasonal phenomenon that experts warn will become increasingly common because of man-made climate change.

Large fires have already ravaged parts of Greece, Italy, Turkey, and Algeria.



© 2021 AFP

Citation:
Rain helps fight against ‘monster’ fire in Spain (2021, September 13)
retrieved 14 September 2021
from https://phys.org/news/2021-09-monster-spain.html




Hexbyte Glen Cove Thousands of tiny anchors keep our cells in place—and now we know how



Most of the cells in our bodies—be they bone, muscle or pancreas cells—are locked into the right place with the help of tiny anchors (called ‘focal adhesions’). These strong anchors use protein chains to link the cell to collagen, the protein that gives structure to our body.

The anchors help the cells stay put and, for the most part, resist disruptions to their environment—but if a cell morphs into a cancer cell, the chain can break, letting the cancer spread to other parts of the body.

Now, for the first time, a team of UNSW Sydney scientists has found the specific protein (or link) in the chain responsible for upholding the connection.

The findings, published today in Nature Materials, build on our understanding of cell mechanics—and could help give new directions for cancer research.

“We’ve identified the protein that’s essential for these attachments to function,” says Ms Maria Lastra Cagigas, lead author of the study and Scientia Ph.D. candidate at UNSW Medicine & Health.

“If these attachments fail, the cell could be more prone to moving and invading tissues, like cancer.”

Scientists already knew that cancer weakens cells’ anchors in some way, but they didn’t know exactly how this happens.

One of the reasons it’s been so hard to study this is the minuscule size of the anchor’s chain: it’s only a few nanometres thick—about 1/10,000th the size of a human hair.

The team used specialized 3D cryo-electron microscopy—a powerful imaging technique that uses an electron microscope to create high-resolution images of cells—to identify tropomyosin as the key protein in the chain holding the anchor in place. Cryo-electron microscopy is currently the most powerful technique to look at proteins inside cells, and its development won the Nobel Prize in Chemistry in 2017.

“This is the first time we can actually see in detail what the anchor’s chain looks like,” says Professor Peter Gunning, co-senior author of the study. The team made the findings at UNSW’s Mark Wainwright Electron Microscope Unit, and are the first in the world to use this technique to look at these tropomyosin chains.

“It’s completely new technology.”

The researchers identified tropomyosin’s role in the anchor’s chain by comparing normal cells with cells from bone cancer patients, along with cancer cells created in the laboratory.

They then tried putting the tropomyosin back into the cancer cells—surprisingly, the anchors managed to attach again.

“Looking into the future, we want to learn if we can leverage this knowledge to reduce the invasion of cancer cells,” says Ms Lastra Cagigas.

“In the short term, we could use this information to find out if a cancer has a predisposition to metastasize, which means to move throughout the body.

“In the long term, we could look into it as a potential target in cancer therapies.”

Prof. Gunning and co-senior author Professor Edna Hardeman, who have been researching this field of science for 40 years, say it’s a milestone in understanding cell mechanics.

“It’s been a real pleasure to watch this work develop,” says Prof. Gunning, who was recently presented with the 2020 President’s Medal from the Australian and New Zealand Society for Cell and Developmental Biology (ANZSCDB) for his contribution to research into cell mechanics.

“It reinforces what has essentially been a lifetime’s work for us: understanding the principles of the architecture of cells.”

A potential drug target

Around 30 percent of the body is made up of collagen, which forms what’s called ‘the matrix’.

“The matrix is like a scaffold present in our bones, ligaments, muscles, and skin. It’s almost everywhere in the body,” says Ms Lastra Cagigas. “Other than the cells that move through our body, like those in blood, the collagen matrix forms the home for most cells—including cancer cells.”

Pancreatic cancer is one of a few cancers that can modify this matrix for its own benefit by creating a ‘barrier’ around the tumor. This barrier works as a defense mechanism, making it harder for cancer treatments like chemotherapy and immunotherapy to kill the cancer cells.

The tumor forces pancreatic cancer-associated fibroblasts (or PCAFs) – cells around the tumor that are anchored by chains—to build this defense barrier. But now that scientists have identified the proteins in the cell’s anchor and chain, they can explore these proteins as future targets for therapies that could loosen that barrier.

“We’ve identified that the type of protein involved in the chain, tropomyosin, is druggable,” says Prof. Hardeman.

“This means it’s possible to develop small molecule inhibitors, or drugs, that can actually attack these proteins.”

Prof. Hardeman says it’s likely that these potential future drugs would be delivered alongside cancer treatments, so the drugs can temporarily destabilize the barrier while the cancer treatments do their work.

Looking ahead

While the findings are encouraging, Prof. Gunning says it doesn’t mean suitable drugs will be available for use in the next few years.

“We have an understanding of the biology, but to go from that to treating a patient is difficult to predict,” he says.

“We can see what the path looks like, but we are less sure of the timeline.”

It’s more likely that in the near future—potentially the next two or three years—the protein in the chain, tropomyosin, may help scientists predict which cancers are likely to spread more quickly.

“As we build on the underlying mechanisms of cancer and expand our markers of cancer cell biology, our discovery adds a missing link to the development of a personalized diagnosis for cancer patients,” says Prof. Gunning.



More information:
Correlative cryo-ET identifies actin/tropomyosin filaments that mediate cell–substrate adhesion in cancer cells and mechanosensitivity of cell proliferation, Nature Materials (2021). DOI: 10.1038/s41563-021-01087-z , www.nature.com/articles/s41563-021-01087-z



Hexbyte Glen Cove Scientists debate promise, peril of tweaking wild genomes


Mosquito-borne avian malaria has wiped out 80 percent of bird species in Hawaii.

In the movie Jurassic Park, reconstructing and tweaking genetic material makes it possible to bring dinosaurs back to life.

Today, a technology that manipulates animal genomes, called gene drive, has become a reality. The goal, however, is not to revive long-gone species, but to eliminate invasive ones.

Steven Spielberg’s film was set on an imaginary island off the coast of Costa Rica, and it is also on an island that the first open-air experiments in programmed extinction could take place, according to experts gathered at the International Union for the Conservation of Nature (IUCN) Congress in Marseille.

It could happen within a decade, they told AFP.

That’s because fragile island ecosystems are in crisis. Dozens of vertebrate species have vanished in the last century, and dozens more are on a glide path to extinction.

The culprits are non-native rats, snakes and mosquitoes—all introduced by humans, for the most part by accident—that eat eggs, infect birds with disease, or outcompete indigenous amphibians and mammals.

For more than 20 years, Island Conservation has been working to eradicate rodents and other invasive alien species, which are a major threat to biodiversity globally, the organisation’s Royden Saah told AFP.

The conservation NGO has been successful on two Galapagos islands—Seymour North and Mosquera—using traps and poison-delivering drones.

But species eradication using these tools is costly and has no guarantee of success. Rat poison is effective, but poses risks to other species.

‘Obvious ecological risks’

“Should we create a genetically modified rat so that its offspring is only male (or female)?”, Island Conservation asks on its website.

So far, this Franken-rat does not exist.

“But if we don’t do the research, we will not know what the potential of this technology is,” said Royden Saah, who coordinates a team of scientists for the NGO.
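A toy calculation shows both the appeal and the risk of the idea Saah describes. The Python sketch below is a deliberately simplified, hypothetical model: the litter size, the starting numbers and the rule that drive-carrying males sire only drive-carrying sons are illustrative assumptions, not a design from Island Conservation or anyone else.

# Hypothetical, deliberately simplified model of a "male-only offspring" drive.
# All parameter values and the inheritance rule are illustrative assumptions.
offspring_per_female = 4.0                      # assumed young surviving per female

# Starting population: mostly wild-type, plus a small release of drive-carrying males.
females, wild_males, drive_males = 100.0, 90.0, 10.0

for generation in range(15):
    p_drive = drive_males / (wild_males + drive_males)   # chance a mate carries the drive
    print(f"gen {generation:2d}: females ~{females:7.1f}, drive frequency in males {p_drive:.0%}")

    births = females * offspring_per_female
    # Matings with drive males yield only drive-carrying sons; matings with
    # wild-type males yield a 50/50 sex ratio of wild-type young.
    drive_males = births * p_drive
    wild_males = births * (1.0 - p_drive) * 0.5
    females = births * (1.0 - p_drive) * 0.5

    if females < 1.0:
        print(f"gen {generation + 1:2d}: fewer than one female left, population collapse")
        break

In this toy run the drive spreads to nearly all males within a few generations and the modeled population collapses in roughly ten, which is why the prospect of such a trait escaping to a rat's native range is treated as a serious ecological risk.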

At its last Congress in 2016, the IUCN’s 1,400 members created a working group to evaluate the issue from every angle—feasibility, costs and benefits, possible side effects, ethics.

On Friday, following intense debate, the congress endorsed a motion on “synthetic biology”—an umbrella term for genetic engineering that includes gene drive—that tilts towards those in favour of continuing with research and experimentation.

Graphic: How gene drive technology could be used to combat malaria.

“I’m scared about the potential applications of synthetic biology,” the head of the IUCN working group, Kent Redford, told AFP in Marseille before the vote.

“There are obvious ecological risks and concerns regarding genetic modification of wild species”, warned Ricarda Steinbrecher, a geneticist working with Pro-Natura.

That NGO and others such as Friends of the Earth, ETC Group and the Heinrich Boll Foundation have sounded alarms on the dangers of synthetic biology and gene drive.

Scientists themselves cannot agree on the precise boundaries of synbio. Does a modified rat still belong to the same species? At what point does it become a new one?

Avian malaria

For some species, science has explored other options. Take the rhinoceros, careening towards extinction because of demand in Asia for its horn, thought to have medicinal properties.

Scientists can now recreate a molecular facsimile of rhino horn in the lab.

“But people want the real product,” said Steinbrecher.

For some island ecosystems, the situation is no less dire than for the rhino, and that urgency is a problem for the technologies under review.

“While there is the potential, [gene drive] is not going to be here in time to save the birds,” said Samuel Gon, a scientific advisor to the NGO Nature Conservancy.

Of more than 50 known endemic bird species in Hawaii, only 15 remain, and five of those are “critically endangered” on the IUCN Red List of Threatened Species—the last stop before “extinct in the wild”.

The birds were mostly wiped out by avian malaria, brought by mosquitoes that arrived in the 19th century by boat.

Hawaii is poised to use another technology that sterilises mosquitoes by inoculating them with a bacterium, Wolbachia.

Meanwhile, the Jurassic Park scenario is still on the cards.

Researchers in the United States and Russia announced earlier this year that they have successfully sequenced the genome of a million-year-old mammoth.

But the next step remains controversial—should it be brought back to life?



© 2021 AFP

Citation:
Scientists debate promise, peril of tweaking wild genomes (2021, September 11)
retrieved 13 September 2021
from https://phys.org/news/2021-09-scientists-debate-peril-tweaking-wild.html

