Key step reached toward long-sought goal of a silicon-based laser

Scanning transmission electron microscopy (STEM) images of one of the Ge/SiGe heterostructures at different magnifications. The SiGe layers appear darker. Credit: Università Roma Tre, De Seta Group

When it comes to microelectronics, there is one chemical element like no other: silicon, the workhorse of the transistor technology that drives our information society. The countless electronic devices we use in everyday life are a testament to how very high volumes of silicon-based components can today be produced at very low cost. It seems natural, then, to use silicon in other areas where the properties of semiconductors—of which silicon is one—are exploited technologically, and to explore ways to integrate different functionalities. Of particular interest in this context are diode lasers, such as those employed in barcode scanners or laser pointers, which are typically based on gallium arsenide (GaAs). Unfortunately, though, the physical processes that create light in GaAs do not work so well in silicon. It therefore remains an outstanding, and long-standing, goal to find an alternative route to realizing a ‘laser on silicon.’

Writing today in Applied Physics Letters, an international team led by Professors Giacomo Scalari and Jérôme Faist from the Institute for Quantum Electronics presents an important step towards such a device. They report electroluminescence—electrical light generation—from a quantum cascade structure based on silicon-germanium (SiGe), a material that is compatible with standard fabrication processes used for silicon devices. Moreover, the emission they observed is in the terahertz frequency band, which sits between those of microwave electronics and infrared optics and is currently of high interest with a view to a variety of applications.

Make silicon shine

The main reason why silicon cannot be used directly to build a laser following the GaAs template has to do with the different nature of their band gaps: direct in GaAs but indirect in silicon. In a nutshell, in GaAs electrons recombine with holes across the band gap, producing light; in silicon, they produce heat. Laser action in silicon therefore requires another path. And exploring a fresh approach is what ETH doctoral researcher David Stark and his colleagues are doing. They are working towards a silicon-based quantum cascade laser (QCL). QCLs achieve light emission not by electron-hole recombination across the band gap, but by letting electrons tunnel through repeated stacks of precisely engineered semiconductor structures, emitting photons in the process.
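The energy scale behind terahertz emission follows from the Planck relation E = h·f. A minimal sketch, assuming an illustrative intersubband spacing of 12 meV (not a value from the paper):

```python
# Planck relation E = h * f linking an intersubband energy spacing to
# the emitted photon frequency. The 12 meV spacing is illustrative,
# not a value taken from the paper.
PLANCK_EV_S = 4.135667696e-15  # Planck constant in eV*s (CODATA)

def photon_frequency_thz(energy_ev):
    """Photon frequency in THz for a transition of the given energy (eV)."""
    return energy_ev / PLANCK_EV_S / 1e12

f = photon_frequency_thz(12e-3)  # a 12 meV subband spacing
print(f"{f:.1f} THz")            # about 2.9 THz, i.e. in the terahertz band
```

Spacings of roughly 10–20 meV, the scale engineered into such heterostructure stacks, thus land squarely in the terahertz band.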

The QCL paradigm has been demonstrated in a number of materials—for the first time in 1994 by a team including Jérôme Faist, then working at Bell Laboratories in the US—but never in silicon-based ones, despite promising predictions. Turning these predictions into reality is the focus of an interdisciplinary project funded by the European Commission, which brings together leading experts in growing semiconductor materials of the highest quality (at the Università Roma Tre), in characterizing them (at the Leibniz-Institut für innovative Mikroelektronik in Frankfurt an der Oder), and in fabricating them into devices (at the University of Glasgow). The ETH group of Scalari and Faist is responsible for performing the measurements on the devices, as well as for the design of the laser, with numerical and theoretical support from partners at the company nextnano in Munich and at the Universities of Pisa and Rome.

As electrons tunnel through the Ge/SiGe heterostructure, they emit light, currently at two slightly different frequencies, due to suboptimal injection in the upper state of the radiative transition. Credit: ETH Zurich/David Stark

From electroluminescence to lasing

With this bundled knowledge and expertise, the team designed and built devices with a unit structure made of SiGe and pure germanium (Ge), less than 100 nanometres in height, which repeats 51 times. From these heterostructures, fabricated with essentially atomic precision, Stark and co-workers detected electroluminescence, as predicted, with the spectral features of the emerging light agreeing well with calculations. Further confidence that the devices work as intended came from a comparison with a GaAs-based structure fabricated with identical device geometry. Although the emission from the Ge/SiGe structure is still significantly weaker than that of its GaAs-based counterpart, these results clearly signal that the team is on the right track. The next step now will be to assemble similar Ge/SiGe structures according to a laser design the team has developed. The ultimate goal is to reach room-temperature operation of a silicon-based QCL.

Such an achievement would be significant in several respects. Not only would it, at long last, realize a laser on a silicon substrate, thereby giving photonics a boost; the emission of the structure created by Stark and colleagues is also in the terahertz region, for which compact light sources are currently largely missing. Silicon-based QCLs, with their potential versatility and reduced fabrication cost, could be a boon for the large-scale use of terahertz radiation in existing and new fields of application, from medical imaging to wireless communication.



More information:
David Stark et al, THz intersubband electroluminescence from n-type Ge/SiGe quantum cascade structures, Applied Physics Letters (2021). DOI: 10.1063/5.0041327

Citation:
Key step reached toward long-sought goal of a silicon-based laser (2021, March 8)
retrieved 10 March 2021
from https://phys.org/news/2021-03-key-long-sought-goal-silicon-based-laser.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.

A better way to measure acceleration

Illustration of an optomechanical accelerometer, which uses light to measure acceleration. The NIST device consists of two silicon chips, with infrared laser light entering at the bottom chip and exiting at the top. The top chip contains a proof mass suspended by silicon beams, which enables the mass to move up and down freely in response to acceleration. A mirrored coating on the proof mass and a hemispherical mirror attached to the bottom chip form an optical cavity. The wavelength of the infrared light is chosen so that it nearly matches the resonant wavelength of the cavity, enabling the light to build in intensity as it bounces back and forth between the two mirrored surfaces many times before exiting. When the device experiences an acceleration, the proof mass moves, changing the length of the cavity and shifting the resonant wavelength. This alters the intensity of the reflected light. An optical readout converts the change in intensity into a measurement of acceleration. Credit: F. Zhou/NIST

You’re going at the speed limit down a two-lane road when a car barrels out of a driveway on your right. You slam on the brakes, and within a fraction of a second of the impact an airbag inflates, saving you from serious injury or even death.

The airbag deploys thanks to an accelerometer—a sensor that detects sudden changes in velocity. Accelerometers keep rockets and airplanes on the correct flight path, provide navigation for self-driving cars, and rotate images so that they stay right-side up on cellphones and tablets, among other essential tasks.

Addressing the increasing demand to accurately measure acceleration in smaller navigation systems and other devices, researchers at the National Institute of Standards and Technology (NIST) have developed an accelerometer a mere millimeter thick that uses laser light instead of mechanical strain to produce a signal.

Although a few other accelerometers also rely on light, the design of the NIST instrument makes the measuring process more straightforward, providing higher accuracy. It also operates over a greater range of frequencies and has been more rigorously tested than similar devices.

Not only is the NIST device, known as an optomechanical accelerometer, much more precise than the best commercial accelerometers; it also does not need to undergo the time-consuming process of periodic calibration. In fact, because the instrument uses laser light of a known frequency to measure acceleration, it may ultimately serve as a portable reference standard to calibrate other accelerometers now on the market, making them more accurate.

The accelerometer also has the potential to improve inertial navigation in such critical systems as military aircraft, satellites and submarines, especially when a GPS signal is not available. NIST researchers Jason Gorman, Thomas LeBrun, David Long and their colleagues describe their work in the journal Optica.

The study is part of NIST on a Chip, a program that brings the institute’s cutting-edge measurement-science technology and expertise directly to users in commerce, medicine, defense and academia.

Accelerometers, including the new NIST device, record changes in velocity by tracking the position of a freely moving mass, dubbed the “proof mass,” relative to a fixed reference point inside the device. The distance between the proof mass and the reference point only changes if the accelerometer slows down, speeds up or switches direction. The same is true if you’re a passenger in a car. If the car is either at rest or moving at constant velocity, the distance between you and the dashboard stays the same. But if the car suddenly brakes, you’re thrown forward and the distance between you and the dashboard decreases.

The motion of the proof mass creates a detectable signal. The accelerometer developed by NIST researchers relies on infrared light to measure the change in distance between two highly reflective surfaces that bookend a small region of empty space. The proof mass, which is suspended by flexible beams one-fifth the width of a human hair so that it can move freely, supports one of the mirrored surfaces. The other reflecting surface, which serves as the accelerometer’s fixed reference point, consists of an immovable microfabricated concave mirror.

Together, the two reflecting surfaces and the empty space between them form a cavity in which infrared light of just the right wavelength can resonate, or bounce back and forth, between the mirrors, building in intensity. That wavelength is determined by the distance between the two mirrors, much as the pitch of a plucked guitar string depends on the distance between the instrument’s fret and bridge. If the proof mass moves in response to acceleration, changing the separation between the mirrors, the resonant wavelength also changes.
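The resonance condition above can be sketched in a few lines. The cavity length and mode number here are illustrative assumptions, not the NIST device’s actual parameters:

```python
# Resonance condition for a two-mirror cavity: the cavity supports
# wavelengths satisfying L = m * lambda / 2, i.e. lambda_m = 2 * L / m.
# Cavity length and mode number are illustrative, not the NIST device's.
def resonant_wavelength(length_m, mode):
    """Wavelength (m) of the mode-th cavity resonance."""
    return 2.0 * length_m / mode

L0 = 1.0e-3                                  # 1 mm cavity (assumed)
m = 1290                                     # mode near 1550 nm infrared light
lam0 = resonant_wavelength(L0, m)            # ~1550 nm
lam1 = resonant_wavelength(L0 + 1.0e-12, m)  # proof mass shifts cavity by 1 pm
shift = lam1 - lam0
# The fractional wavelength shift equals the fractional length change:
# shift / lam0 == 1 pm / 1 mm == 1e-9
```

The useful property is the last line: a picometre-scale motion of the proof mass maps directly onto a parts-per-billion shift of the resonant wavelength, which a stable laser can read out.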

To track the changes in the cavity’s resonant wavelength with high sensitivity, a stable single-frequency laser is locked to the cavity. As described in a recent publication in Optics Letters, the researchers have also employed an optical frequency comb—a device that can be used as a ruler to measure the wavelength of light—to measure the cavity length with high accuracy. The markings of the ruler (the teeth of the comb) can be thought of as a series of lasers with equally spaced wavelengths. When the proof mass moves during a period of acceleration, either shortening or lengthening the cavity, the intensity of the reflected light changes as the wavelengths associated with the comb’s teeth move in and out of resonance with the cavity.

Accurately converting the displacement of the proof mass into an acceleration is a critical step that has been problematic in most existing optomechanical accelerometers. However, the team’s new design ensures that the dynamic relationship between the displacement of the proof mass and the acceleration is simple and easy to model through first principles of physics. In short, the proof mass and supporting beams are designed so that they behave like a simple spring, or harmonic oscillator, that vibrates at a single frequency in the operating range of the accelerometer.
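The harmonic-oscillator model above makes the displacement-to-acceleration conversion a one-line formula: driven well below its resonance frequency f0, the proof mass obeys a ≈ (2πf0)²·x. A minimal sketch, where the 25 kHz resonance and 1 pm displacement are assumed values for illustration, not device specifications:

```python
import math

# A proof mass on spring-like beams is a harmonic oscillator: driven well
# below its resonance frequency f0, its displacement x maps to acceleration
# via a = (2*pi*f0)**2 * x. The 25 kHz resonance and 1 pm displacement
# below are assumed values for illustration, not device specifications.
def acceleration_from_displacement(x_m, f0_hz):
    omega0 = 2.0 * math.pi * f0_hz
    return omega0**2 * x_m  # m/s^2

g = 9.81                                          # m/s^2
a = acceleration_from_displacement(1e-12, 25e3)   # 1 pm at a 25 kHz mode
print(a / g)                                      # roughly 2.5e-3 g
```

Because the conversion involves only the resonance frequency, which can be determined from first principles, no empirical calibration factor is needed.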

This simple dynamic response enabled the scientists to achieve low measurement uncertainty over a wide range of acceleration frequencies—1 kilohertz to 20 kilohertz—without ever having to calibrate the device. This feature is unique because all commercial accelerometers have to be calibrated, which is time-consuming and expensive. Since the publication of their study in Optica, the researchers have made several improvements that should decrease their device’s uncertainty to nearly 1%.

Capable of sensing displacements of the proof mass that are less than one hundred-thousandth the diameter of a hydrogen atom, the optomechanical accelerometer detects accelerations as tiny as 32 billionths of a g, where g is the acceleration due to Earth’s gravity. That’s a higher sensitivity than all accelerometers now on the market with similar size and bandwidth.

With further improvements, the NIST optomechanical accelerometer could be used as a portable, high-accuracy reference device to calibrate other accelerometers without having to bring them into a laboratory.



More information:
Feng Zhou et al, Broadband thermomechanically limited sensing with an optomechanical accelerometer, Optica (2021). DOI: 10.1364/OPTICA.413117

D. A. Long et al, Electro-optic frequency combs for rapid interrogation in cavity optomechanics, Optics Letters (2020). DOI: 10.1364/OL.405299

Citation:
A better way to measure acceleration (2021, March 8)
retrieved 10 March 2021
from https://phys.org/news/2021-03-a-better-way-to-measure.html


Tantalizing signs of phase-change ‘turbulence’ in RHIC collisions

The STAR detector at the U.S. Department of Energy’s Brookhaven National Laboratory Credit: Brookhaven National Laboratory

Physicists studying collisions of gold ions at the Relativistic Heavy Ion Collider (RHIC), a U.S. Department of Energy Office of Science user facility for nuclear physics research at DOE’s Brookhaven National Laboratory, are embarking on a journey through the phases of nuclear matter—the stuff that makes up the nuclei of all the visible matter in our universe. A new analysis of collisions conducted at different energies shows tantalizing signs of a critical point—a change in the way that quarks and gluons, the building blocks of protons and neutrons, transform from one phase to another. The findings, just published by RHIC’s STAR Collaboration in the journal Physical Review Letters, will help physicists map out details of these nuclear phase changes to better understand the evolution of the universe and the conditions in the cores of neutron stars.

“If we are able to discover this critical point, then our map of nuclear phases—the nuclear phase diagram—may find a place in the textbooks, alongside that of water,” said Bedanga Mohanty of India’s National Institute of Science and Research, one of hundreds of physicists collaborating on research at RHIC using the sophisticated STAR detector.

As Mohanty noted, studying nuclear phases is somewhat like learning about the solid, liquid, and gaseous forms of water, and mapping out how the transitions take place depending on conditions like temperature and pressure. But with nuclear matter, you can’t just set a pot on the stove and watch it boil. You need powerful particle accelerators like RHIC to turn up the heat.

RHIC’s highest collision energies “melt” ordinary nuclear matter (atomic nuclei made of protons and neutrons) to create an exotic phase called a quark-gluon plasma (QGP). Scientists believe the entire universe existed as QGP a fraction of a second after the Big Bang—before it cooled and the quarks bound together (glued by gluons) to form protons, neutrons, and eventually, atomic nuclei. But the tiny drops of QGP created at RHIC measure a mere 10⁻¹³ centimeters across (that’s 0.0000000000001 cm) and they last for only 10⁻²³ seconds! That makes it incredibly challenging to map out the melting and freezing of the matter that makes up our world.

“Strictly speaking if we don’t identify either the phase boundary or the critical point, we really can’t put this [QGP phase] into the textbooks and say that we have a new state of matter,” said Nu Xu, a STAR physicist at DOE’s Lawrence Berkeley National Laboratory.

Mapping nuclear phase changes is like studying how water changes under different conditions of temperature and pressure (net baryon density for nuclear matter). RHIC’s collisions “melt” protons and neutrons to create quark-gluon plasma (QGP). STAR physicists are exploring collisions at different energies, turning the “knobs” of temperature and baryon density, to look for signs of a “critical point.” Credit: Brookhaven National Laboratory

Tracking phase transitions

To track the transitions, STAR physicists took advantage of the incredible versatility of RHIC to collide gold ions (the nuclei of gold atoms) across a wide range of energies.

“RHIC is the only facility that can do this, providing beams from 200 billion electron volts (GeV) all the way down to 3 GeV. Nobody can dream of such an excellent machine,” Xu said.

These changes in collision energy turn the collision temperature up and down and also vary a quantity known as net baryon density, which is somewhat analogous to pressure. Looking at data collected during the first phase of RHIC’s “beam energy scan” from 2010 to 2017, STAR physicists tracked particles streaming out at each collision energy. They performed a detailed statistical analysis of the net number of protons produced. A number of theorists had predicted that this quantity would show large event-by-event fluctuations as the critical point is approached.

The reason for the expected fluctuations comes from a theoretical understanding of the force that governs quarks and gluons. That theory, known as quantum chromodynamics, suggests that the transition from normal nuclear matter (“hadronic” protons and neutrons) to QGP can take place in two different ways. At high temperatures, where protons and anti-protons are produced in pairs and the net baryon density is close to zero, physicists have evidence of a smooth crossover between the phases. It’s as if protons gradually melt to form QGP, like butter gradually melting on a counter on a warm day. But at lower energies, they expect what’s called a first-order phase transition—an abrupt change like water boiling at a set temperature as individual molecules escape the pot to become steam. Nuclear theorists predict that in the QGP-to-hadronic-matter phase transition, net proton production should vary dramatically as collisions approach this switchover point.

“At high energy, there is only one phase. The system is more or less invariant, normal,” Xu said. “But when we change from high energy to low energy, you also increase the net baryon density, and the structure of matter may change as you are going through the phase transition area.”

As physicists turned the collision energy down at RHIC, they expected to see large event-by-event fluctuations in certain measurements such as net proton production–an effect that’s similar to the turbulence an airplane experiences when entering a bank of clouds–as evidence of a “critical point” in the nuclear phase transition. Higher-level statistical analyses of the data, including the skewness and kurtosis, revealed tantalizing hints of such fluctuations. Credit: Brookhaven National Laboratory

“It’s just like when you ride an airplane and you get into turbulence,” he added. “You see the fluctuation—boom, boom, boom. Then, when you pass the turbulence—the phase of structural changes—you are back to normal into the one-phase structure.”

In the RHIC collision data, the signs of this turbulence are not as apparent as food and drinks bouncing off tray tables in an airplane. STAR physicists had to perform what’s known as a “higher-order correlation function” statistical analysis of the particle distributions—looking beyond the mean and width of the curve representing the data to properties such as how asymmetrical and skewed the distribution is.
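The higher-order analysis described above amounts to computing standardized central moments of the event-by-event distribution. A minimal sketch on a toy Gaussian sample (not STAR data); near a critical point, the skewness and kurtosis would deviate significantly from the near-zero values a Gaussian gives:

```python
import math
import random

# Standardized central moments of an event-by-event distribution: the
# skewness (3rd moment) and excess kurtosis (4th) quantify the
# non-Gaussian tails that critical-point fluctuations would enhance.
# The sample below is a toy Gaussian, not STAR data.
def moments(samples):
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    sigma = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in samples) / n / sigma**3
    kurt = sum((x - mean) ** 4 for x in samples) / n / sigma**4 - 3.0
    return mean, var, skew, kurt

random.seed(1)
toy = [random.gauss(40, 6) for _ in range(100_000)]  # net-proton counts per event
mean, var, skew, kurt = moments(toy)
# For a pure Gaussian, skew and excess kurtosis sit near zero; significant
# deviations versus collision energy are the "turbulence" being sought.
```

Plotting these moments against collision energy is, in essence, how the STAR team searches for the non-monotonic behavior expected near a critical point.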

The oscillations they see in these higher-order moments, particularly the skewness and kurtosis, are reminiscent of another famous phase change, observed when transparent liquid carbon dioxide suddenly becomes cloudy when heated, the scientists say. This “critical opalescence” comes from dramatic fluctuations in the density of the CO2—variations in how tightly packed the molecules are.

“In our data, the oscillations signify that something interesting is happening, like the opalescence,” Mohanty said.

Yet despite the tantalizing hints, the STAR scientists acknowledge that the range of uncertainty in their measurements is still large. The team hopes to narrow that uncertainty to nail their critical point discovery by analyzing a second set of measurements made from many more collisions during phase II of RHIC’s beam energy scan, from 2019 through 2021.

The entire STAR collaboration was involved in the analysis, Xu notes, with a particular group of physicists—including Xiaofeng Luo (and his student, Yu Zhang), Ashish Pandav, and Toshihiro Nonaka, from China, India, and Japan, respectively—meeting weekly with the U.S. scientists (over many time zones and virtual networks) to discuss and refine the results. The work is also a true collaboration of the experimentalists with nuclear theorists around the world and the accelerator physicists at RHIC. The latter group, in Brookhaven Lab’s Collider-Accelerator Department, devised ways to run RHIC far below its design energy while also maximizing collision rates to enable the collection of the necessary data at low collision energies.

“We are exploring uncharted territory,” Xu said. “This has never been done before. We made lots of efforts to control the environment and make corrections, and we are eagerly awaiting the next round of higher-statistics data.”



More information:
J. Adam et al, Nonmonotonic Energy Dependence of Net-Proton Number Fluctuations, Physical Review Letters (2021). DOI: 10.1103/PhysRevLett.126.092301

Citation:
Tantalizing signs of phase-change ‘turbulence’ in RHIC collisions (2021, March 5)
retrieved 10 March 2021
from https://phys.org/news/2021-03-tantalizing-phase-change-turbulence-rhic-collisions.html


Desert country Jordan aims for green with 10-million tree campaign

Eucalyptus and carob saplings are planted near the forest of Kufranjah, north of Jordan’s capital Amman, part of a reforestation effort that aims to reach 10 million trees in 10 years

On a bare hill in Jordan’s verdant Ajloun region, dozens of people plant saplings as part of a reforestation effort that aims to reach 10 million trees in 10 years.

“The forests in our region are beautiful,” says 11-year-old Mohammed al-Ananza, helping his father Mustafa plant a carob sapling.

“It’s a real shame that we have lost so many to fires… We should work together to protect them,” he says as they work near the Kufranjah forest north of the capital Amman.

Forests make up only one percent of the desert kingdom’s territory, according to the agriculture ministry, though Jordan also has an estimated 23 million orchard trees, half of them olives.

Forest fires strike almost every year in the Middle Eastern country due to high summer temperatures, in a trend scientists expect to intensify with climate change.

The blazes are often started by picnickers’ barbeques or carelessly discarded cigarettes.

There were 499 fires in wooded areas last year alone, according to the agriculture ministry.

“We must make up for what has been lost in the fires,” said Belal Qtishat, head of the nature protection department at the environment ministry.

“It’s the only way to fight desertification and to protect biodiversity.”


Decade after Fukushima, Japan towns struggle to rebuild community

Masakazu Daibo returned to his hometown in Fukushima just last year to reopen a restaurant established by his grandfather

Masakazu Daibo has reopened his family’s eel restaurant in part of Japan declared a no-go zone after the 2011 nuclear disaster, but so far he has barely a single neighbour.

A decade after radiation forced tens of thousands to flee their homes in Fukushima, some towns in the region are still wrestling with the difficult question of how to rebuild a community from scratch.

After the disaster, 12 percent of Fukushima prefecture was off-limits and around 165,000 people fled their homes either under evacuation orders or voluntarily.

Numerous areas have since been declared safe after extensive decontamination, and incentives are being offered to lure people back. But many are reluctant.

Daibo returned just last year, reopening a restaurant established by his grandfather in the town of Namie, around nine kilometres (5.6 miles) from the crippled Fukushima Daiichi nuclear plant.

Namie and 11 neighbouring communities were part of an exclusion zone around the plant, and for years Daibo could enter only on brief visits.

“There were no people but the town remained. It was really like a movie set,” the 65-year-old told AFP.

“I heard no human voices, and saw only wild dogs, cows, pigs.”

The radiation that blanketed the region forced him to discard everything in the restaurant.

Some towns in Fukushima are still wrestling with the difficult question of how to rebuild a community from scratch

Contaminated parts of the walls were removed and he lost everything inside, down to the sauce that had been kept cooking since his grandfather opened the business.

Daibo and his wife hesitated about moving back, but after restrictions were lifted in 2017, they decided they would try to revive the past.

“I want everyone to say ‘Oh, this is a long-forgotten flavour,'” when they taste his food, Daibo said.

“I hope that my presence will shine a light on this town.”

‘Survival is our big issue’

But few others have followed suit.

The restaurant is surrounded by empty lots overgrown with weeds. Wooden signboards are piled up next to a toppled bin in the porch of one abandoned building, in what was once downtown.

Restrictions have been lifted on just 20 percent of Namie, and the town’s population is seven percent of its former size of 21,000, despite incentives including reduced rents and money for moving and renovation.

Daibo and his wife hesitated about moving back, but after restrictions were lifted in 2017, they decided they would try to revive the past

Around 36 percent of residents are aged 65 or above, higher than the 29 percent national average, and just 30 students attend local elementary and junior-high schools, compared with nearly 1,800 before.

Japan as a whole is battling low birthrates and an ageing population, but the issue is in stark relief in Namie.

“We feel like the future of 20 years from now has arrived suddenly,” said town official Takanori Matsumoto.

Namie hopes to raise its population to 8,000 by 2035, helped by national subsidies of up to two million yen ($18,500) per new family moving to disaster-hit areas.

“Survival as a community is our big issue,” Matsumoto said.

Just over two percent of Fukushima remains under evacuation orders, with the figure for evacuees officially at around 35,700, though some experts believe there could be nearly twice as many.

But there is no deadline for lifting all the evacuation orders, and doubts persist that Fukushima Daiichi can be decommissioned on schedule, by 2041 at the earliest.

Namie and 11 neighbouring communities were part of an exclusion zone around the Fukushima Daiichi nuclear plant

‘I can’t go back’

For many, fears over lingering radiation and mistrust of the government’s decontamination process are major obstacles to returning.

“It’s not like I won’t go back. It’s more like I can’t go back,” said Megumi Okada, who was pregnant with her third child at the time of the disaster and left despite being outside the official evacuation zone.

“If I were alone, I would go home,” added the 38-year-old, now a mother of four living in Tokyo.

“But as a mother, I strongly feel that I want to avoid risks for my children.”

Around two-thirds of Fukushima evacuees don’t plan to return, according to a 2020 survey by researchers at Kwansei Gakuin University.

“Many people say they can’t trust the decommissioning target, and their distrust of government measures runs deep,” said Yoko Saito, an associate professor on disaster reduction who jointly conducted the survey.

For Megumi Okada and many others, fears over lingering radiation are a major obstacle to returning

The rate of return to reopened areas varies considerably.

In Kawauchi, which lifted its last evacuation order in 2016, the population is now 68 percent of its pre-2011 figure.

It’s a different story in Futaba, which jointly hosts the crippled plant.

A tiny portion of the town was declared open last year—but not a single person has returned.

All roads into the restricted zone are blocked by barricades, and those entering must wear plastic suits and cover their hair and shoes. Radiation levels on their bodies are measured when they leave.

Crumbling buildings, untended because of radiation, dot the region.

At a ruined inn, an antique clock sits stopped, and fallen teacups litter shelves in a nearby giftshop.

‘A little sad and lonely’

For many in reopened areas, returning has brought conflicting feelings.

Around two-thirds of Fukushima evacuees don’t plan to return, according to researchers

Takao Kohata went back to Minamisoma after authorities lifted restrictions but is still haunted by radiation fears.

Government officials tout strict screening of food in the region, but “many people are still nervous,” the 83-year-old said.

The parents of his four grandchildren won’t let them visit, because they worry about radiation.

“I fully understand their concerns, but I feel a little sad and lonely,” he said.

Some evacuees say they feel forced to return as the government winds up support for the displaced.

“In the end, those who have no place to go and have low incomes are the ones left behind,” said Shohei Yamane, a psychiatric social worker supporting evacuees.

“This disaster will never end as long as there are needy evacuees seeking help,” he added.

Some who have returned have found it takes more than reconstruction to rebuild a community.

Yuko Hikichi helps organise gatherings and group exercise sessions to strengthen community ties in Namie.

“We are just at the starting line… Community-building is not an easy job. It is endless,” she said.

It’s a struggle Masaru Kumakawa knows all too well.

He returned to Namie three years ago, despite losing his wife there in the tsunami, and now lives alone in a new housing district.

The 83-year-old heads a community association, but has struggled to make contact with his neighbours.

“They lived in evacuation for too long,” he said at a newly built community centre.

“We ring doorbells but no one comes out.”



© 2021 AFP

Citation:
Decade after Fukushima, Japan towns struggle to rebuild community (2021, March 9)
retrieved 9 March 2021
from https://phys.org/news/2021-03-decade-fukushima-japan-towns-struggle.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without written permission. The content is provided for information purposes only.



Hexbyte Glen Cove How fast is the universe expanding? Galaxies provide one answer.


NGC 1453, a giant elliptical galaxy in the constellation Eridanus, was one of 63 galaxies used to calculate the expansion rate of the local universe. Last year, the MASSIVE survey team determined that the galaxy is located 166 million light years from Earth and has a black hole at its center with a mass nearly 3 billion times that of the sun. Credit: the Carnegie-Irvine Galaxy Survey

Determining how rapidly the universe is expanding is key to understanding our cosmic fate, but with more precise data has come a conundrum: Estimates based on measurements within our local universe don’t agree with extrapolations from the era shortly after the Big Bang 13.8 billion years ago.

A new estimate of the local expansion rate—the Hubble constant, or H0 (H-naught)—reinforces that discrepancy.

Using a relatively new and potentially more precise technique for measuring cosmic distances, which employs the average stellar brightness within giant elliptical galaxies as a rung on the distance ladder, astronomers calculate a rate—73.3 kilometers per second per megaparsec, give or take 2.5 km/sec/Mpc—that lies in the middle of three other good estimates, including the gold standard estimate from Type Ia supernovae. This means that for every megaparsec—3.3 million light years, or 3 billion trillion kilometers—farther a galaxy is from Earth, it appears to recede an extra 73.3 ±2.5 kilometers per second. The average from the three other techniques is 73.5 ±1.4 km/sec/Mpc.
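
To make the quoted rate concrete, Hubble's law relates recession velocity to distance as v = H0 × d. A minimal sketch of that arithmetic, using the article's numbers (the function name is ours, for illustration only):

```python
# Hubble's law: v = H0 * d, with H0 in km/s per megaparsec.
H0 = 73.3  # the new SBF-based estimate (km/s/Mpc)

def recession_velocity(distance_mpc, h0=H0):
    """Velocity (km/s) at which a galaxy recedes at the given distance (Mpc)."""
    return h0 * distance_mpc

# A galaxy 100 Mpc away, roughly the limit of the SBF method:
print(f"{recession_velocity(100):.0f} km/s")  # 7330 km/s
```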

Perplexingly, estimates of the local expansion rate based on measured fluctuations in the cosmic microwave background and, independently, fluctuations in the density of normal matter in the early universe (baryon acoustic oscillations), give a very different answer: 67.4 ±0.5 km/sec/Mpc.

Astronomers are understandably concerned about this mismatch, because the expansion rate is a critical parameter in understanding the physics and evolution of the universe and is key to understanding dark energy—which accelerates the rate of expansion of the universe and thus causes the Hubble constant to change more rapidly than expected with increasing distance from Earth. Dark energy comprises about two-thirds of the mass and energy in the universe, but is still a mystery.

For the new estimate, astronomers measured fluctuations in the surface brightness of 63 giant elliptical galaxies to determine their distances, then plotted distance against velocity for each to obtain H0. The surface brightness fluctuation (SBF) technique is independent of other techniques and has the potential to provide more precise distance estimates than other methods within about 100 Mpc of Earth, or 330 million light years. The 63 galaxies in the sample are at distances ranging from 15 to 99 Mpc, looking back in time a mere fraction of the age of the universe.
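
The final step described here—plotting distance against velocity and extracting H0—amounts to fitting the slope of a line through the origin. A toy sketch with synthetic, noiseless data (the real analysis uses the 63 measured galaxies and their uncertainties):

```python
# Synthetic SBF-style sample: distances in Mpc, velocities in km/s.
distances = [15.0, 30.0, 55.0, 75.0, 99.0]   # spans the article's 15-99 Mpc range
velocities = [d * 73.3 for d in distances]   # noiseless, for clarity

# Least-squares slope through the origin: H0 = sum(v*d) / sum(d*d)
h0 = (sum(v * d for v, d in zip(velocities, distances))
      / sum(d * d for d in distances))
print(f"H0 = {h0:.1f} km/s/Mpc")  # H0 = 73.3 km/s/Mpc
```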

“For measuring distances to galaxies out to 100 megaparsecs, this is a fantastic method,” said cosmologist Chung-Pei Ma, the Judy Chandler Webb Professor in the Physical Sciences at the University of California, Berkeley, and professor of astronomy and physics. “This is the first paper that assembles a large, homogeneous set of data, on 63 galaxies, for the goal of studying H-naught using the SBF method.”

Ma leads the MASSIVE survey of local galaxies, which provided data for 43 of the galaxies—two-thirds of those employed in the new analysis.

The data on these 63 galaxies was assembled and analyzed by John Blakeslee, an astronomer with the National Science Foundation’s NOIRLab. He is first author of a paper now accepted for publication in The Astrophysical Journal that he co-authored with colleague Joseph Jensen of Utah Valley University in Orem. Blakeslee, who heads the science staff that support NSF’s optical and infrared observatories, is a pioneer in using SBF to measure distances to galaxies, and Jensen was one of the first to apply the method at infrared wavelengths. The two worked closely with Ma on the analysis.

“The whole story of astronomy is, in a sense, the effort to understand the absolute scale of the universe, which then tells us about the physics,” Blakeslee said, harkening back to James Cook’s voyage to Tahiti in 1769 to measure a transit of Venus so that scientists could calculate the true size of the solar system. “The SBF method is more broadly applicable to the general population of evolved galaxies in the local universe, and certainly if we get enough galaxies with the James Webb Space Telescope, this method has the potential to give the best local measurement of the Hubble constant.”

The James Webb Space Telescope, 100 times more powerful than the Hubble Space Telescope, is scheduled for launch in October.

Giant elliptical galaxies

The Hubble constant has been a bone of contention for decades, ever since Edwin Hubble first measured the local expansion rate and came up with an answer seven times too big, implying that the universe was actually younger than its oldest stars. The problem, then and now, lies in pinning down the location of objects in space that give few clues about how far away they are.

Astronomers over the years have laddered up to greater distances, starting with calculating the distance to objects close enough that they seem to move slightly, because of parallax, as the Earth orbits the sun. Variable stars called Cepheids get you farther, because their brightness is linked to their period of variability, and Type Ia supernovae get you even farther, because they are extremely powerful explosions that, at their peak, shine as bright as a whole galaxy. For both Cepheids and Type Ia supernovae, it’s possible to figure out the absolute brightness from the way they change over time, and then the distance can be calculated from their apparent brightness as seen from Earth.
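
The "absolute brightness to distance" step above uses the standard distance-modulus relation, m − M = 5·log10(d / 10 pc). A sketch with a hypothetical observation (the −19.3 peak magnitude is a typical textbook value for Type Ia supernovae, not a figure from the article):

```python
def distance_parsecs(apparent_mag, absolute_mag):
    """Invert the distance modulus m - M = 5*log10(d / 10 pc) to get d in parsecs."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Hypothetical Type Ia supernova: absolute peak magnitude M = -19.3,
# observed apparent magnitude m = 15.7:
d_pc = distance_parsecs(15.7, -19.3)
print(f"{d_pc / 1e6:.0f} Mpc")  # 100 Mpc
```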

The best current estimate of H0 comes from distances determined by Type Ia supernova explosions in distant galaxies, though newer methods—time delays caused by gravitational lensing of distant quasars and the brightness of water masers orbiting black holes—all give around the same number.

The technique using surface brightness fluctuations is one of the newest and relies on the fact that giant elliptical galaxies are old and have a consistent population of old stars—mostly red giant stars—that can be modeled to give an average infrared brightness across their surface. The researchers obtained high-resolution infrared images of each galaxy with the Wide Field Camera 3 on the Hubble Space Telescope and determined how much each pixel in the image differed from the “average”—the smoother the fluctuations over the entire image, the farther the galaxy, once corrections are made for blemishes like bright star-forming regions, which the authors exclude from the analysis.
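
The "smoother means farther" logic follows from counting statistics: each pixel collects light from N unresolved stars, N grows as distance squared, and the pixel-to-pixel fluctuation falls as 1/√N. A toy model of that scaling (the reference values are ours, not the paper's calibration):

```python
import math

def fractional_fluctuation(distance_mpc, stars_ref=100.0, d_ref=15.0):
    """Relative pixel-to-pixel rms: stars per pixel scale as distance^2,
    so Poisson fluctuations (1/sqrt(N)) fall off as 1/distance."""
    n = stars_ref * (distance_mpc / d_ref) ** 2
    return 1.0 / math.sqrt(n)

print(round(fractional_fluctuation(15), 3))  # 0.1   -> nearby galaxy, grainy image
print(round(fractional_fluctuation(99), 3))  # 0.015 -> distant galaxy, smooth image
```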

Neither Blakeslee nor Ma was surprised that the expansion rate came out close to that of the other local measurements. But they are equally confounded by the glaring conflict with estimates from the early universe—a conflict that many astronomers say means that our current cosmological theories are wrong, or at least incomplete.

The extrapolations from the early universe are based on the simplest cosmological theory—called lambda cold dark matter, or ΛCDM—which employs just a few parameters to describe the evolution of the universe. Does the new estimate drive a stake into the heart of ΛCDM?

“I think it pushes that stake in a bit more,” Blakeslee said. “But it (ΛCDM) is still alive. Some people think, regarding all these local measurements, (that) the observers are wrong. But it is getting harder and harder to make that claim—it would require there to be systematic errors in the same direction for several different methods: supernovae, SBF, gravitational lensing, water masers. So, as we get more independent measurements, that stake goes a little deeper.”

Ma wonders whether the uncertainties astronomers ascribe to their measurements, which reflect both systematic errors and statistical errors, are too optimistic, and that perhaps the two ranges of estimates can still be reconciled.

“The jury is out,” she said. “I think it really is in the error bars. But assuming everyone’s error bars are not underestimated, the tension is getting uncomfortable.”

In fact, one of the giants of the field, astronomer Wendy Freedman, recently published a study pegging the Hubble constant at 69.8 ±1.9 km/sec/Mpc, roiling the waters even further. The latest result from Adam Riess, an astronomer who shared the 2011 Nobel Prize in Physics for discovering dark energy, reports 73.2 ±1.3 km/sec/Mpc. Riess was a Miller Postdoctoral Fellow at UC Berkeley when he performed this research, and he shared the prize with UC Berkeley and Berkeley Lab physicist Saul Perlmutter.

MASSIVE galaxies

The new value of H0 is a byproduct of two other surveys of nearby galaxies—in particular, Ma’s MASSIVE survey, which uses space and ground-based telescopes to exhaustively study the 100 most massive galaxies within about 100 Mpc of Earth. A major goal is to weigh the supermassive black holes at the centers of each one.

To do that, precise distances are needed, and the SBF method is the best to date, she said. The MASSIVE survey team used this method last year to determine the distance to a giant elliptical galaxy, NGC 1453, in the southern sky constellation of Eridanus. Combining that distance, 166 million light years, with extensive spectroscopic data from the Gemini and McDonald telescopes—which allowed Ma’s graduate students Chris Liepold and Matthew Quenneville to measure the velocities of the stars near the center of the galaxy—they concluded that NGC 1453 has a central black hole with a mass nearly 3 billion times that of the sun.

To determine H0, Blakeslee calculated SBF distances to 43 of the galaxies in the MASSIVE survey, based on 45 to 90 minutes of HST observing time for each galaxy. The other 20 came from another survey that employed HST to image large galaxies, specifically ones in which Type Ia supernovae have been detected.

Most of the 63 galaxies are between 8 and 12 billion years old, which means that they contain a large population of old red stars, which are key to the SBF method and can also be used to improve the precision of distance calculations. In the paper, Blakeslee employed both Cepheid variable stars and a technique that uses the brightest red giant stars in a galaxy—referred to as the tip of the red giant branch, or TRGB technique—to ladder up to galaxies at large distances. They produced consistent results. The TRGB technique takes account of the fact that the brightest red giants in a galaxy have about the same absolute brightness.

“The goal is to make this SBF method completely independent of the Cepheid-



Hexbyte Glen Cove Etna keeps up its spectacular explosions; ash rains on towns


In this Tuesday, Feb. 16, 2021 file photo, lava gushes from the Mt. Etna volcano near Catania, southern Italy. The spectacular explosions of Mount Etna are continuing Sunday, March 7, 2021, with the volcano in eastern Sicily, spewing out towering clouds of ash and lava stones. Italy’s national geophysics and volcanology institute said the latest big explosion occurred hours before dawn on Sunday, with the volcano exhibiting increasing tremors throughout the night. (AP Photo/Salvatore Allegra, File)

A particularly spectacular blast from Italy’s Mount Etna volcano belched out a towering cloud of ash and lava stone Sunday onto Sicilian villages, the latest in a series of explosions since mid-February.

Italy’s national geophysics and volcanology institute INGV said the powerful explosion at 2 a.m. was the 10th such big blast since Feb. 16, when Europe’s most active volcano started giving off an impressive demonstration of nature’s fire power, coloring the night sky in shocking hues of orange and red.

Increasing tremors rattled the mountain throughout much of the night. Ash and small lava stones rained down on eight villages on Etna’s slopes Sunday morning, while lava flowed from the southeast crater slowly down an uninhabited side, as it has been doing for the last three weeks, the institute said.

The column of ash and lava reached a height of 10,000 meters (33,000 feet) on Sunday, according to scientists who monitor the volcano with specialized instruments from an observatory at Etna in eastern Sicily.

Locals swept ash and lava stones from their front steps and balconies. They have taken to covering cars parked outdoors with carpets, blankets and sheets of cardboard to make cleanup easier after each blast. Winds helped carry the ash eastward, INGV said.

In this Wednesday, Feb. 24, 2021 file photo, a fiery river of glowing lava flows on the north-east side of the Mt. Etna volcano engulfed with ashes and smoke near Milo, Sicily. The spectacular explosions of Mount Etna are continuing Sunday, March 7, 2021, with the volcano in eastern Sicily, spewing out towering clouds of ash and lava stones. Italy’s national geophysics and volcanology institute said the latest big explosion occurred hours before dawn on Sunday, with the volcano exhibiting increasing tremors throughout the night. (AP Photo/Salvatore Allegra, File )

No injuries or serious damage have been reported after the recent blasts. Geologically active, Etna occasionally becomes particularly noisy and explosive as it has been lately.

By mid-morning, Etna’s latest display of activity had slowed somewhat with the flow ending, although the volcano was still puffing out “weak emission of ash” from the southeast crater. A few hours later, the volcanic tremors picked up again, INGV said in a statement.

The INGV scientists say there is no way of predicting when this current round of particularly robust volcanic activity might subside.



© 2021 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.

Citation:
Etna keeps up its spectacular explosions; ash rains on towns (2021, March 7)
retrieved 8 March 2021
from https://phys.org/news/2021-03-etna-spectacular-explosions-ash-towns.html




Hexbyte Glen Cove Food security: Irradiation and essential oil vapors for cereal treatment


INRS Professor Monique Lacroix is an expert in sciences applied to food, such as irradiation. Credit: Christian Fleury (INRS)

A combined treatment of irradiation and essential oil vapors could effectively destroy insects, bacteria and mold in stored grains. A team from the Institut national de la recherche scientifique (INRS), led by Professor Monique Lacroix, has demonstrated the effect of this process on insects affecting rice. The study was published in Radiation Physics and Chemistry.

Microorganisms and insects are the main enemies of stored grains. Currently, the industry uses fumigants to destroy them. However, these compounds, which evaporate or decompose into gases in air or water, threaten human health and the environment. “When grain is fumigated, a small amount of gas is absorbed by the grain and released into the atmosphere. For food irradiation, the treatment is physical. If new molecules are produced, they are no different than those produced by normal processes applied to food, such as heat,” says Professor Monique Lacroix.

Food irradiation refers to the exposure of food to ionizing radiation, including gamma rays and X-rays. The research team has determined the effectiveness of these two processes, both with and without essential oils.

Increasing radiosensitivity

The study aimed to test whether the energy level of the irradiation source, at variable dose rates, could affect the dose (or treatment time) needed to kill 90% of insects or molds. The research team showed that gamma rays were more effective against insects than X-rays. In addition, a higher dose rate with gamma rays proved to be more effective than a low dose rate. Moreover, by adding eucalyptus and tea tree essential oils, the effectiveness significantly improved. “With the addition of essential oils, the dose needed was four to six times smaller depending on gamma dose rates. In fact, oils increase the sensitivity of insects to radiation,” says Professor Lacroix.

Similar results can be seen for bacteria and molds, even though they are more resistant to radiation. According to a previous study, microorganisms’ sensitivity to radiation increased by about 1.5 times with the addition of thyme and oregano essential oils.

The team also conducted experiments with essential oil vapors diffused in 5 kg bags of rice. In the future, the team would like to test the process in an industrial setting, through partnership with companies.



More information:
Farah Hossain et al, Radiosensitization of rice weevil Sitophilus oryzae using combined treatments of essential oils and ionizing radiation with gamma-ray and X-Ray at different dose rates, Radiation Physics and Chemistry (2020). DOI: 10.1016/j.radphyschem.2020.109286

Provided by
Institut national de la recherche scientifique – INRS

Citation:
Food security: Irradiation and essential oil vapors for cereal treatment (2021, March 5)
retrieved 8 March 2021
from https://phys.org/news/2021-03-food-irradiation-essential-oil-vapors.html




Hexbyte Glen Cove ‘I want to ride my bicycle!’ People set to change mobility choices post-lockdown


Credit: Pixabay/CC0 Public Domain

New research suggests a significant proportion of Trinity’s staff and student population that formerly relied on public transport will now choose to walk or cycle to campus when it fully re-opens after lockdown.

Among the reasons provided for such a shift were that public transport options in Dublin’s city center were inadequate even when operating at capacity pre-COVID, and that public health guidelines are not enforced on public transport, raising safety concerns.

Prior to the COVID-19 pandemic, 16% of those surveyed walked and 10.4% cycled to Trinity. Once the campus re-opens fully, 27.3% intend to walk and 27.8% intend to cycle.

The research, which synthesized over 2,500 responses from staff and students surveyed in June and July 2020, did however also find that many people still harbor all-too familiar fears around how safe cycling and walking in Dublin is. Some of those people indicated they too would be likely to switch to walking or cycling to Trinity if there were safer options.

“Our research shows how travel to our city center campus may change as restrictions ease. One of the key findings is that as travel returns to the campus that it is likely to be done by more active modes with some reluctance to use public transport options,” said Professor Brian Caulfield, from Trinity’s School of Engineering.

“Trinity’s campus is one of the most sustainable in the world when it comes to mobility with less than 1% of staff driving to the campus each day. This research demonstrates the appetite of staff and students to embrace active modes of transport when returning to the campus and also shows the need for the plans that Dublin City Council is implementing in the city to ensure that social distancing can still take place.

“Given that there will be a significant reduction in public transport capacity for the foreseeable future it is crucial that we continue working together to promote active modes of transport. By enabling remote learning and working we can bridge this gap that the reduction in public transport capacity creates.”

Most people would like to continue working/studying from home at least some of the time

The research, just published in the journal Case Studies on Transport Policy, also provides an interesting snapshot into the minds of a city center workforce and student body adjusting to life working and studying from home, and considering how the ‘new normal’ may affect their attitudes in the future.

The majority of staff members are keen to continue working from home at least some of the time post-lockdown (29.9% preferring to do so one or two days a week, and 60.4% preferring to do so 3+ days a week).

Although students are less keen to continue studying from home some of the time, the majority would prefer not to be on campus all the time (26.1% would prefer to study from home 3+ days a week, and 43.2% one or two days a week).

“This research emerges from a collaboration between researchers in all three Faculties within Trinity College Dublin, Healthy Trinity, and Dublin City Council,” added Sarah Bowman, Director of Strategic Engagement & Impact Assessment, and a co-author of the paper. “It’s a model of cross-sector, transdisciplinary research that seeks to provide data and analyses that encourage evidence-informed decision making.”



More information:
Brian Caulfield et al. Re-open our City and Campus post-Covid: A case study of Trinity College Dublin, the University of Dublin, Case Studies on Transport Policy (2021). DOI: 10.1016/j.cstp.2021.02.016

Citation:
‘I want to ride my bicycle!’ People set to change mobility choices post-lockdown (2021, March 5)
retrieved 8 March 2021
from https://phys.org/news/2021-03-bicycle-p



Hexbyte Glen Cove Study marks major milestone for Louisiana coastal plan


This May 1, 2019, file photo shows the Davis Pond Diversion emptying into Lake Cataouache, with tree growth on the edges of the channels in St. Charles Parish, La. A nearly $2 billion plan to divert water and sediment from the Mississippi River to rebuild land in southeastern Louisiana, a proposal considered the cornerstone of the state’s efforts to protect its rapidly eroding coast, has passed a major milestone with the publication of the Army Corps of Engineers long-awaited environmental impact study, Thursday, March 4, 2021. (AP Photo/Gerald Herbert, File)

A nearly $2 billion plan to divert water and sediment from the Mississippi River to rebuild land in southeastern Louisiana—considered the cornerstone of the state’s efforts to protect its rapidly eroding coast—has passed a major milestone with the publication of the long-awaited Army Corps of Engineers environmental impact study.

The report issued late Thursday spells out the potential benefits of the plan projected to build thousands of acres of freshwater marshes on the western side of the Mississippi River that will help knock down storm surge in the New Orleans area to the north. But it will also have damaging effects on the state’s important shrimping and oyster industries as well as dolphins and some coastal communities.

The Mid-Barataria Sediment Diversion will essentially cut a hole into the levee around the town of Ironton, southeast of New Orleans, and channel part of the river’s flow into Barataria Bay. As the water moves into the bay, sediment settles and eventually accumulates into land. It’s considered the most important project of the state’s $50 billion coastal protection plan that details various measures the state is taking to protect a coast that has lost 2,000 square miles (5,200 square kilometers) since 1932.

“This project is a lifeline for coastal Louisiana. … It is the shot in the arm that our coast needs right now,” said Chip Kline, who chairs the agency charged with protecting the state’s coast.

The concept has been discussed for decades and essentially tries to recreate nature. Southeastern Louisiana is formed of sediment that washed down the Mississippi River and deposited over thousands of years. But as the river was leveed to protect surrounding communities, the sediment instead washed into the Gulf of Mexico.

The loss of sediment coupled with canals dug for oil and gas development and navigation allowed saltwater to push farther and farther into areas like Barataria Bay.

This May 1, 2019, file photo, shows the Davis Pond Diversion, a project that diverts water from the Mississippi River, left, into the Barataria Basin to reduce coastal erosion in St. Charles Parish, La. A nearly $2 billion plan to divert water and sediment from the Mississippi River to rebuild land in southeastern Louisiana, a proposal considered the cornerstone of the state’s efforts to protect its rapidly eroding coast, has passed a major milestone with the publication of the Army Corps of Engineers long-awaited environmental impact study, Thursday, March 4, 2021. (AP Photo/Gerald Herbert, File)

To build the diversion, the state must get permission from the Army Corps of Engineers, which issues a draft environmental impact statement spelling out the project’s potential impacts; the state then proposes how it will mitigate them.

The money to build the diversion comes from fines BP paid after the 2010 Deepwater Horizon oil spill that sent millions of gallons of oil into the Gulf of Mexico and killed 11 rig workers. Barataria Bay was ground zero for the oil damage.

The group that oversees how the oil spill fines are spent has released a report detailing the diversion’s benefits and impacts and why they support going ahead with it.

At top capacity, the project would divert 75,000 cubic feet (2,125 cubic meters) of water per second from the Mississippi River through a 2-mile-long (3-kilometer-long) concrete channel into the bay. Gates at the river’s edge would control the flow and the state would have an extensive monitoring system to measure things like salinity levels in the bay.
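
The quoted flow figures convert as follows; a quick sanity check of the numbers, using the exact definition 1 ft = 0.3048 m:

```python
# Convert the diversion's top capacity from cubic feet to cubic meters per second.
CUBIC_FT_TO_M3 = 0.3048 ** 3          # 0.028316846592 m^3 per cubic foot (exact)
flow_m3_per_s = 75_000 * CUBIC_FT_TO_M3
print(f"{flow_m3_per_s:.0f} m^3/s")   # 2124 m^3/s, i.e. roughly the 2,125 quoted
```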

A final permit could be issued as early as next year after public comments.

According to the report, the diversion could build as much as 17,300 acres (27 square miles) by 2050, although the acreage would drop in later years because of sea level rise and subsidence. The resulting land and marsh would help to knock down storm surge by as much as 1 foot (0.3 meters) in the New Orleans area.

This May 1, 2019, file photo shows the Davis Pond Diversion in St. Charles Parish, La. A nearly $2 billion plan to divert water and sediment from the Mississippi River to rebuild land in southeastern Louisiana, a proposal considered the cornerstone of the state’s efforts to protect its rapidly eroding coast, has passed a major milestone with the publication of the Army Corps of Engineers long-awaited environmental impact study, Thursday, March 4, 2021. (AP Photo/Gerald Herbert, File)

The strongest opposition to the diversion has come from members of the state’s fishing industry, particularly shrimpers and oyster growers that are an integral part of the state’s culture, history and economy. They worry about the effects of adding so much freshwater to an area that’s largely saltwater right now.

The Corps study said the change in salinity levels in the bay would have major, adverse impacts on eastern oysters and brown shrimp.

The state is proposing to help oyster harvesters and shrimp fishermen adapt to the changes with measures such as paying for refrigeration units for shrimping boats that have to travel farther to shrimp or paying to establish more oyster seed grounds in different areas.

Brad Robin’s family opened Robin’s Seafood in 1947 but have been fishing in the coastal waters even longer than that. His business now focuses exclusively on oysters, and he’s a member of the state’s oyster task force. He’s worried that the diversion will devastate the oyster harvesting on the western side of the river the way that the extended opening of the Bonnet Carre Spillway in 2019 devastated oysters on the eastern side of the river.

“The question I ask all the time,” he said. “Just tell us the truth: Where do we fit in?”

The report also said the diversion could reduce the number of dolphins that live in the bay by a third over two decades. The state is proposing various mitigation efforts such as expanding its network to monitor for stranded dolphins and putting money toward reducing other stresses.

The diversion will also increase tidal flooding in many of the small fishing communities farther south along the river, including some minority and low-income communities. State mitigation efforts include raising roads, houses and infrastructure or in some cases buying out properties.




Citation:
Study marks major milestone for Louisiana coastal plan (2021, March 6)
retrieved 7 March 2021
from https://phys.org/news/2021-03-major-milestone-louisiana-coastal.html

