Hexbyte Glen Cove Big tech data centers spark worry over scarce Western water

In this Tuesday, Oct. 5, 2021, photo, The Dalles Mayor Richard Mays looks at the view of his town and the Columbia River from his hilltop home in The Dalles, Oregon. Mays helped negotiate a proposal by Google to build new data centers in the town. The data centers require a lot of water to cool their servers, and would use groundwater and surface water, but not any water from the Columbia River. Credit: AP Photo/Andrew Selsky

Conflicts over water are as old as history itself, but the massive Google data centers on the edge of this Oregon town on the Columbia River represent an emerging 21st century concern.

Now a critical part of modern computing, data centers help people stream movies on Netflix, conduct transactions on PayPal, post updates on Facebook, store trillions of photos and more. But a single facility can also churn through millions of gallons of water per day to keep hot-running equipment cool.

Google wants to build at least two more data centers in The Dalles, worrying some residents who fear there eventually won’t be enough water for everyone—including for area farms and fruit orchards, which are by far the biggest users.

Across the United States, there has been some mild pushback as tech companies build and expand data centers, conflicts likely to grow as water becomes a more precious resource amid the threat of climate change and as demand for cloud computing grows. Some tech giants are using cutting-edge research and development to find less water-intensive cooling methods, but critics say the companies can still do more to be environmentally sustainable.

The concerns are understandable in The Dalles, the seat of Wasco County, which is suffering extreme and exceptional drought, according to the U.S. Drought Monitor. The region last summer endured its hottest days on record, reaching 118 degrees Fahrenheit (48 Celsius) in The Dalles.

This Tuesday, Oct. 5, 2021, photo shows the historic Granada Theater building in downtown The Dalles, Oregon. Credit: AP Photo/Andrew Selsky

The Dalles is adjacent to the mighty Columbia River, but the new data centers wouldn’t be able to use that water; instead, they would draw on rivers and groundwater that pass through the city’s water treatment plant.

However, the snowpack in the nearby Cascade Range that feeds the aquifers varies wildly year-to-year and glaciers are melting. Most aquifers in north-central Oregon are declining, according to the U.S. Geological Survey Groundwater Resources Program.

Adding to the unease: The 15,000 town residents don’t know how much water the proposed data centers will use, because Google calls it a trade secret. Even the town councilors, who are scheduled to vote on the proposal on Nov. 8, had to wait until this week to find out.

Dave Anderson, public works director for The Dalles, said Google obtained the rights to 3.9 million gallons of water per day when it purchased land formerly home to an aluminum smelter. Google is requesting less water for the new data centers than that amount and would transfer those rights to the city, Anderson said.

In this Tuesday, Oct. 5, 2021, photo, people walk on streets undergoing a renewal project to fix decayed buildings in downtown The Dalles, Oregon. Credit: AP Photo/Andrew Selsky

“The city comes out ahead,” he said.

For its part, Google said it’s “committed to the long-term health of the county’s economy and natural resources.”

“We’re excited that we’re continuing conversations with local officials on an agreement that allows us to keep growing while also supporting the community,” Google said, adding that the expansion proposal includes a potential aquifer program to store water and increase supply during drier periods.

The U.S. hosts 30% of the world’s data centers, more than any other country. Some data centers are trying to become more efficient in water consumption, for example by recycling the same water several times through a center before discharging it. Google even uses treated sewage water, instead of using drinking water as many data centers do, to cool its facility in Douglas County, Georgia.

In this Tuesday, Oct. 5, 2021, photo, Dawn Rasmussen stands at her well at her property on the outskirts of The Dalles, Oregon. She says the water table that her well draws from has dropped 15 feet in the last 15 years, and she has deep concerns about Google’s proposal to build more data centers, which use vast amounts of water, in the town. Credit: AP Photo/Andrew Selsky

Facebook’s first data center took advantage of the cold high-desert air in Prineville, Oregon, to chill its servers, and went a step further when it built a center in Lulea, Sweden, near the Arctic Circle.

Microsoft even placed a small data center, enclosed in what looks like a giant cigar, on the seafloor off Scotland. When the company retrieved the barnacle-encrusted container last year after two years underwater, employees saw improvement in overall reliability because the servers weren’t subjected to temperature fluctuations or corrosion from oxygen and humidity. Team leader Ben Cutler said the experiment shows data centers can be kept cool without tapping freshwater resources.

A study published in May by researchers at Virginia Tech and Lawrence Berkeley National Laboratory showed one-fifth of data centers rely on water from moderately to highly stressed watersheds.

Tech companies typically consider tax breaks and availability of cheap electricity and land when placing data centers, said study co-author Landon Marston, assistant professor of civil and environmental engineering at Virginia Tech.

This Tuesday, Oct. 5, 2021, photo shows the exterior of a Google data center in The Dalles, Oregon. The Dalles City Council member Long-Curtiss wants to know more details about Google’s proposal to build more data centers in the town before the city council votes on the matter. Credit: AP Photo/Andrew Selsky

They need to consider water impacts more seriously, and put the facilities in regions where they can be better sustained, both for the good of the environment and their own bottom line, Marston said.

“It’s also a risk and resilience issue that data centers and their operators need to face, because the drought that we’re seeing in the West is expected to get worse,” Marston said.

About an hour’s drive east of The Dalles, Amazon is giving back some of the water its massive data centers use. Amazon’s sprawling campuses, spread between Boardman and Umatilla, Oregon, butt up against farmland, a cheese factory and neighborhoods. Like many data centers, they use water primarily in summer, with the servers being air-cooled the rest of the year.

About two-thirds of the water Amazon uses evaporates. The rest is treated and sent to irrigation canals that feed crops and pastures.

Umatilla City Manager Dave Stockdale appreciates that farms and ranches are getting that water; the main issue the city faced as Amazon’s facilities grew was that its water treatment plant couldn’t have handled the data centers’ discharge.

John DeVoe, executive director of WaterWatch of Oregon, which seeks reform of water laws to protect and restore rivers, criticized it as a “corporate feel good tactic.”

“Does it actually mitigate for any harm of the server farm’s actual use of water on other interests who may also be using the same source water, like the environment, fish and wildlife?” DeVoe said.

Adam Selipsky, CEO of Amazon Web Services, insists that Amazon feels a sense of responsibility for its impacts.

“We have intentionally been very conscious about water usage in any of these projects,” he said, adding that the centers brought economic activity and jobs to the region.

  • This Tuesday, Oct. 5, 2021, photo shows the land beyond the fencing where Google wants to build more data centers at the site of a former aluminum smelter in The Dalles, Oregon. Credit: AP Photo/Andrew Selsky
  • In this Tuesday, Oct. 5, 2021, photo, The Dalles Public Works Director Dave Anderson, left, and Mayor Richard Mays pose for photos with the Columbia River in The Dalles, Oregon. Credit: AP Photo/Andrew Selsky
  • In this Tuesday, Oct. 5, 2021, photo, Public Works Director Dave Anderson, left, and Mayor Richard Mays pose for photos in The Dalles, Oregon. Credit: AP Photo/Andrew Selsky

Dawn Rasmussen, who lives on the outskirts of The Dalles, worries that her town is making a mistake in negotiating with Google, likening it to David versus Goliath.

She’s seen the level of her well-water drop year after year and worries sooner or later there won’t be enough for everyone.

“At the end of the day, if there’s not enough water, who’s going to win?” she asked.



© 2021 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.

Citation:
Big tech data centers spark worry over scarce Western water (2021, October 22)
retrieved 23 October 2021
from https://phys.org/news/2021-10-big-tech-centers-scarce-western.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.


NASA targeting Feb. 2022 to launch new lunar program Artemis

A NASA employee holds the official Artemis mission patch at NASA Plum Brook Station in Sandusky, Ohio.

NASA said Friday it is now targeting February 2022 for the uncrewed lunar mission Artemis 1, the first step in America’s plan to return humans to the Moon later this decade.

NASA had initially wanted to launch the mission by the end of this year, with astronauts on the lunar surface by 2024 on Artemis 3, but the timeline has slipped.

It achieved a major milestone Wednesday when it stacked the Orion crew capsule atop its Space Launch System megarocket, which now stands 322 feet (98 meters) tall inside the Vehicle Assembly Building at NASA Kennedy Space Center in Florida.

After further tests, it will be wheeled out to the launch pad for a final test known as the “wet dress rehearsal” in January, with the first window for launch opening in February, officials told reporters on a call.

“The February launch period opens on the 12th and our last opportunity in February is on the 27th,” said Mike Sarafin, Artemis 1 manager.

The next windows are in March and then April.

These potential launch periods depend on orbital mechanics and the relative position of the Earth with respect to the Moon.

The mission duration is expected to be four to six weeks.

It will also deploy a number of small satellites, known as CubeSats, to perform experiments and technology demonstrations.

Although likely to be pushed back, Artemis 2 is technically scheduled for 2023, and Artemis 3 for 2024, which would mark humanity’s return to the Moon for the first time since the Apollo 17 mission in 1972.

NASA says the moonwalkers will include the first woman and first person of color to make the trip.

The agency is seeking to establish a sustainable presence on the Moon, and use the lessons it learns to plan a crewed trip to Mars in the 2030s.



© 2021 AFP

Citation:
NASA targeting Feb. 2022 to launch new lunar program Artemis (2021, October 22)
retrieved 23 October 2021
from https://phys.org/news/2021-10-nasa-feb-lunar-artemis.html



NASA launches tool that measures Western water loss

Credit: Pixabay/CC0 Public Domain

NASA on Thursday launched an online platform with information on how much water evaporates into the atmosphere from plants, soils and other surfaces in the U.S. West, data it says could help water managers, farmers and state officials better manage resources in the parched region.

The platform, OpenET, uses satellite imagery from the Landsat program, a decades-long project of NASA and the U.S. Geological Survey that records human and natural impacts on Earth’s surface.

Specifically, it provides data for 17 Western states—down to the quarter-acre—on how much evapotranspiration has taken place. That’s the process by which moisture in leaves, soil and other surfaces evaporates into the air.
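To make the units concrete, evapotranspiration is usually reported as millimeters of water lost per day. The sketch below uses the Hargreaves equation, a common temperature-based approximation; it is only an illustration with made-up inputs, not the satellite-driven energy-balance method OpenET actually uses.

```python
import math

def hargreaves_et0(tmin_c, tmax_c, ra_mj_m2_day):
    """Reference evapotranspiration (mm/day) via the Hargreaves equation,
    a standard temperature-based approximation."""
    tmean = (tmin_c + tmax_c) / 2.0
    # 0.408 converts radiation from MJ/m^2/day to mm/day of evaporated water
    return 0.0023 * 0.408 * ra_mj_m2_day * (tmean + 17.8) * math.sqrt(tmax_c - tmin_c)

# Hypothetical hot summer day: 15-35 C, extraterrestrial radiation ~40 MJ/m^2/day
print(round(hargreaves_et0(15.0, 35.0, 40.0), 1))  # about 7.2 mm/day
```

A field losing roughly 7 mm/day is shedding about 70 cubic meters of water per hectare per day, which is why day-by-day, field-by-field estimates matter for irrigation scheduling.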

The West has been mired in drought for more than two decades. Scientists say human-caused climate change has intensified conditions. Water levels at key reservoirs on the Colorado River have fallen to historic lows alongside growing demand, prompting the federal government to declare water supply cuts for some states next year. A blazing summer and years of record-breaking wildfires have also sapped moisture from the ground.

Detailed information on soil moisture could help farmers and water managers better plan during dry conditions and reduce how much water is used for irrigation, NASA scientists said on a Thursday call with reporters.

“Farmers and water managers have not had consistent, timely data on one of the most important pieces of information for managing water, which is the amount of water that’s consumed by crops and other plants as they grow,” said Robyn Grimm, a water specialist with the Environmental Defense Fund, which helped NASA develop the tool alongside other environmental groups and Google.

“To date, that data has been expensive and fragmented,” she said.

Many large farms in dry areas, such as California’s Central Valley, already have years of experience using advanced data systems to measure evapotranspiration and other water metrics that influence their growing and harvesting seasons and watering schedules.

Cannon Michael runs an 11,000-acre (4,452 hectare) farm in Merced County, California, that produces tomatoes, melons, cotton and alfalfa. Michael said he looked at NASA’s new platform, but didn’t think it would provide any additional benefit for his farm.

“We closely monitor and understand our water use,” he said. “Our farm is 75% drip-irrigated, and we have a very detailed scheduling and forecasting process already in place.”

Meanwhile, Colorado rancher Joe Stanko in Steamboat Springs had read about the new tool in a magazine. Her family grows hay for their cattle, and she said the platform could help them determine which fields need more water to replenish soil moisture. It could also help them decide when to harvest hay.

NASA said the platform includes historical data dating back to 1984. In coming months, it will be updated to include information about precipitation rates with the same level of detail. Eventually, the tool will extend to other parts of the U.S., including areas around the Mississippi River and Appalachian region, scientists said.



© 2021 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.

Citation:
NASA launches tool that measures Western water loss (2021, October 22)


When and why did human brains decrease in size 3,000 years ago? Ants may hold clues

Credit: Pixabay/CC0 Public Domain

The brain is the most complex organ in the human body. Now, a new study has brought us closer to understanding some of its evolution. It shows that human brains decreased in size approximately 3,000 years ago. By studying ants as models to illustrate why brains may increase or decrease in size, the researchers hypothesize that brain shrinkage parallels the expansion of collective intelligence in human societies.

Studying and understanding the causes and consequences of brain evolution helps us understand the nature of humanity. It is well documented that human brains have increased in size over the course of our evolutionary history. Less appreciated is the fact that human brains have decreased in size since the Pleistocene. When exactly these changes happened, and why, was not well known.

“A surprising fact about humans today is that our brains are smaller compared to the brains of our Pleistocene ancestors. Why our brains have reduced in size has been a big mystery for anthropologists,” explained co-author Dr. Jeremy DeSilva, from Dartmouth College.

To disentangle this mystery, a team of researchers from different academic fields set out to study the historical patterns of human brain evolution, comparing their findings with what is known in ant societies to offer broad insights.

“A biological anthropologist and a behavioral ecologist and evolutionary neurobiologist began sharing their thoughts on brain evolution and found bridging research on humans and ants might help identify what is possible in nature,” said co-author Dr. James Traniello, from Boston University.

Their paper, published in Frontiers in Ecology and Evolution, sheds new light on the evolution of our brain.

A recent size decrease

The researchers applied a change-point analysis to a dataset of 985 fossil and modern human crania. They found that human brains increased in size 2.1 million years ago and 1.5 million years ago, during the Pleistocene, but decreased in size around 3,000 years ago (Holocene), which is more recent than previous estimates.
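A change-point analysis of this kind looks for the moments where the mean of a series shifts. The toy sketch below runs a one-changepoint, least-squares fit on synthetic data (not the study's 985-crania dataset) with a drop planted 3,000 years before present; it is only meant to illustrate the idea.

```python
import random

def best_changepoint(y):
    """One-changepoint fit: pick the split index that minimizes total
    within-segment squared error around each segment's mean."""
    def sse(seg):
        m = sum(seg) / len(seg)
        return sum((v - m) ** 2 for v in seg)
    # require at least 2 points per segment
    return min(range(2, len(y) - 2), key=lambda i: sse(y[:i]) + sse(y[i:]))

# Synthetic illustration: cranial capacity holds steady, then drops
# around 3,000 years before present.
random.seed(0)
t = [-100_000 + 500 * k for k in range(201)]   # years before present
y = [(1500.0 if ti < -3_000 else 1400.0) + random.gauss(0, 20) for ti in t]
i = best_changepoint(y)
print(t[i])   # estimated change point, near -3,000
```

Real change-point methods generalize this to multiple breaks and penalize extra ones to avoid overfitting, but the objective (fit piecewise-constant means) is the same.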

“Most people are aware that humans have unusually large brains—significantly larger than predicted from our body size. In our deep evolutionary history, human brain size dramatically increased,” said Traniello. “The reduction in human brain size 3,000 years ago was unexpected.”

The timing of the size increases coincides with what was previously known about the early evolution of Homo and the technical advancements that led to, for example, better diet and nutrition and larger social groups.

As for the decrease in brain size, the interdisciplinary team of researchers propose a new hypothesis, finding clues within ant societies.

What could ants teach us about human brain evolution?

“We propose that ants can provide diverse models to understand why brains may increase or decrease in size due to social life. Understanding why brains increase or decrease is difficult to study using only fossils,” explained Traniello.

Studying computational models and patterns of worker ant brain size, structure, and energy use in some ant clades, such as the Oecophylla weaver ant, Atta leafcutter ants, or the common garden ant Formica, showed that group-level cognition and division of labor may select for adaptive brain size variation. This means that within a social group where knowledge is shared or individuals are specialists at certain tasks, brains may adapt to become more efficient, such as decreasing in size.

“Ant and human societies are very different and have taken different routes in social evolution,” Traniello said. “Nevertheless, ant societies also share with humans important aspects of social life, such as group decision-making and division of labor, as well as the production of their own food (agriculture). These similarities can broadly inform us of the factors that may influence changes in human brain size.”

Brains use up a lot of energy, and smaller brains use less energy. The externalization of knowledge in human societies, which means each individual needs less energy to store large amounts of information, may have favored a decrease in brain size.

“We propose that this decrease was due to increased reliance on collective intelligence, the idea that a group of people is smarter than the smartest person in the group, often called the ‘wisdom of the crowds,'” added Traniello.

DeSilva concluded, “We look forward to having our hypothesis tested as additional data become available.”



More information:
Jeremy DeSilva et al, When and Why Did Human Brains Decrease in Size? A New Change-Point Analysis and Insights from Brain Evolution in Ants, Frontiers in Ecology and Evolution (2021). DOI: 10.3389/fevo.2021.742639


Study: Only 2% of global giving goes to curb climate change

Credit: CC0 Public Domain

Global philanthropic spending to help halt climate change grew last year—but still remains less than 2% of all giving, according to a new report from the ClimateWorks Foundation.

That’s a really low number, especially as the detrimental effects of climate change have become clearer, says Surabi Menon, vice president of global intelligence at ClimateWorks and a co-author of the report. “With the extreme events taking place, we thought it was shocking that there was so little money.”

Giving by foundations and individual donors to mitigate climate change grew by 14% in 2020, according to the report, “Funding Trends 2021: Climate Change Mitigation Philanthropy.”

In 2019, the group estimates that $5 billion to $9 billion went to curb climate change. Last year, it grew to $6 billion to $10 billion. That estimate does not include the nearly $800 million that the Bezos Earth Fund gave to climate-change efforts at the end of 2020 because the announcement came late in the year and the group did not have the data it needed to assess those gifts.

The failure to invest in efforts to fight the climate crisis is not due to a lack of interest from grant makers, says Phil Buchanan, president of the Center for Effective Philanthropy, a research and consulting organization that advises foundations on their grant making. In 2016, his group surveyed foundation leaders about what they saw as the biggest challenges facing society. Climate change was second only to inequality, he says.

The issue may be hard for foundations to address in part because so many grant makers focus their work on regional and local issues, he says. They may not see such a massive global problem as part of their mission or see how local efforts can have an impact on a complex international issue.

In addition, foundations often want to see quantifiable results where they can know that their support produced a particular result, Buchanan says. Climate change—a fraught global issue with very few efforts resulting in short-term identifiable change—does not lend itself well to this mind-set.

“There’s nothing good about the fact that we’re now seeing, with increasing regularity, the obvious impacts of climate change across communities,” Buchanan says. “But maybe that’s the wake-up call that will finally get more philanthropy unleashed toward doing something about it.”

Johannes Lundershausen, the climate knowledge lead at Active Philanthropy, a German organization that advises donors from around the world on how to effectively support efforts to curb climate change, acknowledges that the 2% figure is disappointing, especially given the need for profound changes in this decade.

He says there need to be more ways for donors to get involved without having to start from scratch and those that are already working on climate need to do a better job of collaborating. But, he says, because the issue is relatively new, it’s understandable that the total amount of support is small. “The trajectory is right,” he says. “There has been progress.”

European foundation funding for climate mitigation outside of Europe tripled from 2016 to 2020, according to a report from ClimateWorks, the European Foundation Centre, and the Hour Is Late, released last week.

Menon, too, sees interest increasing. This year, several large foundations have made pledges focused on climate change, including the $10 billion Earth Fund and the Protecting Our Planet Challenge, in which nine foundations pledged to spend $5 billion on conservation over a decade.

The report also found that donors who in 2018 at the Global Climate Action Summit pledged to give $4 billion to climate solutions are meeting those commitments and more have joined them, increasing the pledge to $6 billion to be given by 2025.

The bulk of the money given to climate causes was from individual donors. Foundations accounted for only $1.9 billion of climate-change donations in 2020, but that was up from about $900 million in 2015.

The report breaks down giving by sector and region in an effort to identify gaps in funding and areas in which donors can help spur progress.

In the past, a lot of donors have supported efforts to make the power sector greener and help it move away from coal and other carbon-based fuels, Menon says. But now, she says, there are real opportunities for philanthropy to boost electric transportation and curb greenhouse-gas emissions in the manufacturing and industrial sectors as well as in the food system, where methane gas, a particularly potent contributor to climate change, is an issue.

The report also discussed foundation support for racial-justice efforts that it said attracted more than $500 million in the United States in the first half of 2020. Indigenous people in the United States are 48% more likely to live in areas that are impacted by flooding from sea-level rise, according to an Environmental Protection Agency analysis cited in the report. The EPA also found that Latinos are 43% more likely to lose income due to increased heat and that mortality rates will increase for Black people due to climate change.

The report cited another study that found that in 2016 and 2017 just 1.3% of grants from 12 large environmental grant makers went to environmental-justice groups. But ClimateWorks did not break out funding for those groups in its own report.

“That was one piece we wanted to highlight this year, but we just don’t have the data,” Menon says. ClimateWorks doesn’t have enough information about the environmental-justice movement to accurately determine the amount of money it is getting and how it is being used, she says. “We’d rather work with another entity, other experts who can tell us how to look at the funding in that space and help us.”

Despite the relatively stagnant levels of funding for this issue—Menon says it’s not going to be easy to boost climate funding above 2 percent of philanthropic spending—some new foundations and donors are starting to give money. The Quadrature Climate Foundation, started in 2019 by the founders of a hedge fund, has pledged $100 million a year to fight climate change. And other large announcements by some donors this year are encouraging.

More donors are approaching Active Philanthropy seeking ways to integrate climate into their existing programs, Lundershausen says. Donors that support work on agriculture or economic transformation are looking for ways to include climate change in their approach to those issues, he says. That might also be a way for grant makers in the United States to focus more on climate change, Buchanan says. They won’t have to change everything they do to address climate, just make it a factor in the work they already support on education, hunger, or other issues.

The increased attention and interest is encouraging, says Menon: “We’re really excited to see new foundations coming on board.”


Chip-based optical tweezers levitate nanoparticles in a vacuum

Schematic of optically levitating a nanoparticle with a metalens in a vacuum. The metalens focuses a laser beam to create chip-based optical tweezers. Credit: Kunhong Shen, Purdue University

Researchers have created tiny chip-based optical tweezers that can be used to optically levitate nanoparticles in a vacuum. Optical tweezers—which employ a tightly focused laser beam to hold living cells, nanoparticles and other objects—can be used for a variety of precision measurements and sensing applications. However, these optical traps are usually produced with bulky optical components.

“By using an ultrathin metalens, we reduced the diameter of the focusing lens from about 25 mm to about 0.4 mm,” said research team leader Tongcang Li from Purdue University. “The chip-based design can be used to create an integrated and flexible optical system for studying near-surface forces by trapping an object less than 1 micrometer away from a surface. It might also be useful for trapping cold atoms in a vacuum to study quantum processes.”

In the journal Optica, researchers at Purdue University and Pennsylvania State University report the first realization of on-chip optical levitation in a vacuum with an ultrathin metalens. Accomplishing this feat in a vacuum helps improve the sensitivity of the system.

“Optically levitated particles can be used to create accelerometers and gyroscopes that could potentially be used in navigation,” said Li. “Scientists are also using optically levitated particles to search for dark matter and dark energy and to study gravity at small scales, which will deepen our understanding of nature.”

Researchers used a metalens to create chip-based optical tweezers that can optically levitate nanoparticles in a vacuum. The diameter of the metalens was about 50 times smaller than that of a conventional objective lens that they used before. An optical image of the metalens is shown. Credit: Tongcang Li, Purdue University

Toward a portable trap

This new research grows out of previous work in which the researchers used optical levitation in a vacuum to create the fastest human-made rotor and the most sensitive torque detector ever reported.

“As a next step, we wanted to make optical levitation technology more practical by minimizing the system enough to make it portable,” said Li. “We began by reducing the size of the focusing lens by using a metalens, a type of flat lens that uses nanostructures to focus light.”

In the new work, the researchers designed a metalens consisting of thousands of silicon nanopillars. The diameter of the metalens was about 50 times smaller than that of the conventional objective lens that they used before.

“Other research groups have recently demonstrated metalens-based optical trapping in liquids,” said Kunhong Shen, the first author of this work. “Although performing optical trapping in a vacuum helps minimize noise from liquid or air, it is also much more difficult to do.”

The researchers showed that a levitating nanoparticle could be transferred between two separate optical traps. Credit: Kunhong Shen, Purdue University

Levitating with a flat lens

To test their new optical design, the researchers guided an intense laser beam onto the metalens to generate trapping forces. They then sprayed a diluted nanoparticle solution into the trapping area. When a nanoparticle becomes trapped, it appears as a bright spot that can be observed with a camera. Photon detectors measured the nanoparticle’s movement in real time.

They showed that the metalens could levitate a nanoparticle in a vacuum at a pressure of 2×10⁻⁴ Torr (about 1/4,000,000 of atmospheric pressure) without requiring any feedback stabilization. They were also able to transfer a levitating nanoparticle between two separate optical traps.
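As a rough sanity check on the quoted figure, the pressure ratio can be computed directly, assuming standard atmospheric pressure of 760 Torr:

```python
# Rough check of the quoted vacuum level: 2×10⁻⁴ Torr relative to
# standard atmospheric pressure (760 Torr).
TRAP_PRESSURE_TORR = 2e-4
ATMOSPHERE_TORR = 760.0

ratio = ATMOSPHERE_TORR / TRAP_PRESSURE_TORR
print(f"Trap pressure is about 1/{ratio:,.0f} of atmospheric pressure")
```

This gives roughly 1/3,800,000, consistent with the article's "about 1/4,000,000".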

“Our metalens is a nanostructure layer with a thickness of merely 500 nm and a large numerical aperture of about 0.9. It offers similar performance to a conventional bulky lens,” said research team leader Xingjie Ni from Pennsylvania State University. “The metalens is fully vacuum-compatible. And more interestingly, we can flexibly design it to perform additional functions, for example, filtering out low-spatial-frequency components from the focusing light, which we have proven to be beneficial for optical levitation of nanoparticles.”
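A numerical aperture of 0.9 implies a very tight focus. A minimal sketch of the diffraction-limited spot estimate, assuming a 1064 nm trapping laser (a common choice for optical levitation; the actual wavelength used in the study may differ):

```python
# Diffraction-limited focal spot estimate: d ≈ λ / (2·NA).
# The 1064 nm wavelength is an assumption, not a figure from the paper.
WAVELENGTH_NM = 1064.0  # assumed trapping laser wavelength
NA = 0.9                # numerical aperture reported for the metalens

spot_diameter_nm = WAVELENGTH_NM / (2 * NA)
print(f"Estimated focal spot diameter: {spot_diameter_nm:.0f} nm")
```

Under these assumptions, the focal spot is of order 600 nm, comparable to the size of the trapped nanoparticles.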

The researchers are now working to improve the tiny levitation devices by boosting the transmission and focusing efficiency of the metalens. They also want to make the diameter of the metalens even smaller to make optical levitation more practical for real-world applications.



More information:
Kunhong Shen et al, On-chip optical levitation with a metalens in vacuum, Optica (2021). DOI: 10.1364/OPTICA.438410

Citation:
Chip-based optical tweezers levitate nanoparticles in a vacuum (2021, October 21)
retrieved 21 October 2021
from https://phys.org/news/2021-10-chip-based-optical-tweezers-levitate-nanoparticles.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.


New MOND theory able to account for cosmic microwave background

The linear matter power spectrum (MPS) P(k) for the models, showing excellent fits to the Sloan Digital Sky Survey (SDSS) data release 7 (DR7) luminous red galaxies (LRG). A bias parameter b is also included. Note that the (derived) Hubble constant for each model is different. Credit: DOI: 10.1103/PhysRevLett.127.161302

A pair of researchers at the Czech Academy of Sciences has been shaking up the astrophysics community with a new modified Newtonian dynamics (MOND) theory that tosses out the concept of dark matter and instead backs up theories that suggest there is a type of as-yet undiscovered gravity responsible for attributes seen in the cosmic microwave background (CMB). Constantinos Skordis and Tom Zlosnik have published a paper describing their ideas in Physical Review Letters.

For several years, many in the astrophysics community have used the idea of dark matter to explain phenomena that could not be explained in conventional ways—gravitational lensing, for example, or gravitational wave speed measurements found in the CMB. Others have been less willing to accept dark matter because of the lack of physical evidence of its existence. Some have suggested instead that there might be another kind of gravity at work that might explain such observations. These MOND theories have not carried much weight, however, because they cannot account for the features seen in the CMB. That appears to have changed as Skordis and Zlosnik are claiming that they have built a MOND model that does account for such data, and still accounts for gravitational lensing.
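For context, the original (non-relativistic) MOND proposal of Milgrom replaces the Newtonian acceleration law with an interpolating function \(\mu\), so that gravity is effectively stronger at very low accelerations:

```latex
\mu\!\left(\frac{a}{a_0}\right) a = a_N,
\qquad \mu(x) \to 1 \;\; (x \gg 1),
\qquad \mu(x) \to x \;\; (x \ll 1)
```

so that in the deep-MOND regime \(a = \sqrt{a_N a_0}\), with the empirical acceleration scale \(a_0 \approx 1.2 \times 10^{-10}\ \mathrm{m\,s^{-2}}\). The relativistic theory discussed here embeds this low-acceleration behavior in a field-theoretic framework.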

The new model begins by using the original MOND idea of two types of fields that behave together as a type of gravitational force—one scalar, the other vector-based. Next, the researchers added parameters describing gravity-modifying fields generated in the early universe—ones that mimicked dark matter. These fields, they further suggest, evolved over time until they became the type of force described by the original MOND model.

Skordis and Zlosnik suggest their model can be used to explain both gravitational lensing and the features seen in the CMB. Up next, they plan to find out if it can also explain the universe’s lithium abundance and discrepancies in measurements of the cosmic expansion rate—something prior theories have failed to do. They also note that their ideas are free of ghost instabilities.



More information:
Constantinos Skordis et al, New Relativistic Theory for Modified Newtonian Dynamics, Physical Review Letters (2021). DOI: 10.1103/PhysRevLett.127.161302

© 2021 Science X Network

Citation:
New MOND theory able to account for cosmic microwave background (2021, October 21)
retrieved 21 October 2021
from https://phys.org/news/2021-10-mond-theory-account-cosmic-microwave.html



DNA shows Japanese wolf closest relative of domestic dogs

Stuffed specimen of Honshu wolf (Japanese Wolf, Canis hodophilax). Exhibit in the National Museum of Nature and Science, Tokyo, Japan. Credit: Momotarou2012/Wikimedia Commons, CC BY-SA 3.0

A team of researchers affiliated with several entities in Japan has found evidence that the Japanese wolf is the closest known relative of domestic dogs. The team has published a paper describing their genetic analysis of the extinct wolf and its relationship with modern dogs.

The Japanese wolf is a subspecies of the gray wolf and once lived on many of the islands of what is now Japan. The subspecies was declared extinct in 1905 after hunters and landowners killed them off, but many tissue and bone samples were preserved. In this new effort, the researchers extracted DNA from tissue in bone samples from several museums in Japan.

By comparing the DNA of the Japanese wolf with the DNA of other wolves, dogs and related species such as foxes, the researchers found that it resides on a unique evolutionary branch of wolves—one that arose sometime between 20,000 and 40,000 years ago. They also noted that some of those ancient wolves evolved into Japanese wolves and others evolved into dogs.

Prior research has shown that modern domestic dogs evolved from a type of gray wolf that does not exist today. This new work suggests that scientists are getting closer to learning more about that unique wolf. The new DNA evidence suggests that it lived in East Asia (not the Middle East or Europe as has been widely suggested) and its wolf line migrated later to Japan. It is still unclear, however, what happened to the line that evolved into dogs.

The DNA also showed that there was some interbreeding between the wolf line and the dog line. A prior study has shown that approximately 2% of the DNA from a sled dog that died 10,000 years ago was from the Japanese wolf. The researchers suggest such interbreeding appears to have occurred prior to the Japanese wolf making its way to Japan; thus, it does not appear likely that dogs made their way there until much later. They also note that New Guinea singing dogs and dingoes have the highest amount of Japanese wolf DNA of any modern species, suggesting the wolf migrated great distances.



More information:
Jun Gojobori et al, The Japanese wolf is most closely related to modern dogs and its ancestral genome has been widely inherited by dogs throughout East Eurasia, bioRxiv (2021). DOI: 10.1101/2021.10.10.463851

© 2021 Science X Network

Citation:
DNA shows Japanese wolf closest relative of domestic dogs (2021, October 21)
retrieved 21 October 2021
from https://phys.org/news/2021-10-dna-japanese-wolf-closest-relative.html



Going off the rails: Research reveals ecological impact of rail transport on UK bat species

Credit: University of Sussex

New research from the University of Sussex has revealed the ecological impact of rail transport on bats in the UK, throwing light on a previously unstudied area.

Globally, rail traffic is predicted to increase by a third by 2050. In the UK, new railways such as HS2 and Crossrail are being constructed, and there is increasing use of the existing network for both passenger and freight services. Whilst rail offers significant environmental benefits compared with other transport, there has been little research into its impact on wildlife. Existing studies have tended to focus on fatalities caused by train collisions rather than non-lethal effects caused by sound and light pollution.

But in a new paper published in Scientific Reports, Professor Fiona Mathews and Dr. Paul Jerem from the University of Sussex show that the activity of two UK bat species is significantly disrupted by passing trains.

Fiona Mathews, Professor of Environmental Biology at the University of Sussex, said that “railway verges can be attractive environments for bats. The lines often provide ‘green corridors’ through otherwise inhospitable environments, such as urban centers or intensively managed agricultural land, and activity of bats is generally high along them. However, railway lines are becoming increasingly busy at night.”

“Our study reveals that this rail traffic is likely to make these areas unsuitable for up to two thirds of the night.”

Using ultrasonic detectors at 12 wooded rail-side sites in southern England, the researchers monitored the common and soprano pipistrelle bats, which are abundant in Europe and both frequently use habitats associated with transport infrastructure.

They found that the activity of these two species fell by 30-50% each time a train passed, and it took at least two minutes for activity to recover. The more frequently the trains passed, the greater the total amount of disturbance caused.

Prof Mathews and Dr. Jerem suspect the bats are showing avoidance of the sound created by the trains—as well as the physical object of the train itself.

Dr. Jerem, now at Vrije Universiteit Amsterdam, said that “taking evasive action every time a train passes may be energetically expensive, and reduce feeding opportunities, potentially disadvantaging bats using rail-side habitats”.

The occasional passing train is unlikely to present a problem for bat conservation. However, on many railway lines, the disturbance now occurs every three or four minutes. Every time a train passes, the bats have to take aversive action for two minutes, at huge energetic cost.
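The "up to two thirds of the night" figure follows from simple arithmetic: if each passing train suppresses activity for about two minutes and trains arrive every three minutes, two out of every three minutes are disturbed. A quick sketch:

```python
# Fraction of time bat activity is suppressed, given a recovery time
# per passing train and a fixed interval (headway) between trains.
def disturbed_fraction(recovery_min: float, headway_min: float) -> float:
    # Disturbance cannot exceed the whole interval between trains.
    return min(1.0, recovery_min / headway_min)

# Trains every 3 minutes, ~2 minutes of suppressed activity per train:
print(disturbed_fraction(2, 3))  # -> two thirds of the time disturbed
# Trains every 4 minutes:
print(disturbed_fraction(2, 4))  # -> half the time disturbed
```

The recovery time and headways are taken from the figures quoted in the article; real traffic is of course irregular, so this is only an upper-bound sketch.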

Globally, bats play an essential role in pest control, pollinating plants and dispersing seeds. In the UK, they frequently interact with transport networks due to their broad distribution.

The researchers are now calling for greater consideration of wildlife when it comes to public transport.

Prof Mathews added that “railways are one of the most efficient ways of transporting people and freight, and with the climate crisis, people, as well as companies who need to transport goods overnight, will be turning to rail with one eye on the planet.”

“But at the moment, no consideration is given to the ecological impacts of increased railway traffic. We’d like to see this change, and appropriate mitigation could be as simple as setting aside larger buffer areas adjacent to railways, reducing the impact on bats—and possibly many other species.”



More information:
Paul Jerem et al, Passing rail traffic reduces bat activity, Scientific Reports (2021). DOI: 10.1038/s41598-021-00101-3

Citation:
Going off the rails: Research reveals ecological impact of rail transport on UK bat species (2021, October 20)
retrieved 20 October 2021


Zapping untreated water gets rid of more waterborne viruses

Transmission electron micrographs of a nonenveloped virus, MS2 bacteriophage, before electrocoagulation (left) and after (right). Credit: Texas A&M Engineering/Shankar Chellam and Anindito Sen

Using sophisticated microscopy and computational analysis, Texas A&M University researchers have now validated the merit of a water purification technology that uses electricity to remove and inactivate an assortment of waterborne viruses. They said the yet-to-be-implemented water purification strategy could add another level of safety against pathogens that cause gastrointestinal ailments and other infections in humans.

“There is always a need for new techniques that are better, cheaper and more effective at safeguarding the public against disease-causing microorganisms,” said Shankar Chellam, professor in the Zachry Department of Civil and Environmental Engineering. “The technique investigated in this study is a promising strategy to kill even more viruses at the earliest stages of water purification.”

The researchers have detailed their findings in the journal Environmental Science & Technology.

Before water reaches homes, it undergoes multiple purification steps, namely coagulation, sedimentation, filtration and disinfection. Conventional coagulation methods use chemicals to trigger the clumping of particles and microbes within untreated water. These aggregates can then be removed when they settle as sediments. While effective, Chellam noted that the chemicals used for coagulation can be very acidic, making their transport to treatment plants and storage a challenge.

Instead of chemical-based coagulation, the researchers investigated whether an up-and-coming coagulation method that uses electricity was as effective at removing microbes from water. In particular, they used a surrogate of a nonenveloped virus, called MS2 bacteriophage, for their study. Their choice of microbe was motivated by the fact that MS2 bacteriophage shares structural similarities with many nonenveloped viruses that can persist in water after treatment and cause disease in humans.

For their experiments, the researchers inserted iron electrodes into a sample of untreated water laden with viruses. When they passed electrical currents, the anode oxidized, releasing iron ions into the solution. These ions react further in the water to produce hydroxyl radicals and also iron-rich precipitates. In the process, they found that as the iron precipitated, the virus attached to these clumps to form bigger aggregates, which could be easily removed from the water.
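The amount of iron an anode releases can be estimated with Faraday's law of electrolysis. The current, duration, and assumption of Fe²⁺ dissolution below are illustrative values, not figures from the study:

```python
# Faraday's law of electrolysis: mass = (I * t * M) / (z * F)
FARADAY = 96485.0        # Faraday constant, C/mol
MOLAR_MASS_FE = 55.845   # molar mass of iron, g/mol
Z = 2                    # electrons per ion, assuming Fe -> Fe2+ + 2e-

def iron_released_g(current_a: float, time_s: float) -> float:
    """Mass of iron (grams) dissolved from the anode at 100% efficiency."""
    return (current_a * time_s * MOLAR_MASS_FE) / (Z * FARADAY)

# Illustrative example: 0.1 A applied for one minute.
mass_g = iron_released_g(0.1, 60)
print(f"{mass_g * 1000:.2f} mg of iron released")
```

In practice the current efficiency is below 100% and depends on the water chemistry, so this is an upper-bound estimate.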

Next, they investigated whether this process also inactivated the virus. But while the iron aggregates helped capture the virus, they made it difficult to image inactivation using electron microscopy.

“Clumping causes a tremendous challenge because there is no easy way to isolate the virus from the iron-rich aggregates, making it difficult to visualize viral damage and analyze if electrocoagulation was the cause of the viral damage or the virus extraction from the iron-rich clumps,” said Anindito Sen, a research scientist at the Texas A&M Microscopy and Imaging Center.

To address this problem, Kyungho Kim, a graduate student in Chellam’s group, developed a novel computational technique with Sen’s guidance to directly image the viruses aggregated alongside iron. Put briefly, they digitally inflicted damage on 3D images of an intact MS2 bacteriophage. Then they generated 2D versions of the damaged 3D model. Last, they compared these images with 2D microscopic images of the viruses obtained after electrocoagulation.
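The paper's actual reconstruction pipeline is far more sophisticated, but the core idea of projecting a damaged 3D model down to 2D and comparing projections can be sketched as follows. The grid size, spherical model, synthetic "damage", and difference metric here are purely illustrative, not the study's method:

```python
# Illustrative sketch: build a 3D binary model of a particle, apply
# synthetic damage, and compare the resulting 2D projections.
N = 16  # voxel grid size (illustrative)

def sphere_model(radius: float):
    """3D voxel model: 1 inside a centered sphere, 0 outside."""
    c = (N - 1) / 2
    return [[[1 if (x - c)**2 + (y - c)**2 + (z - c)**2 <= radius**2 else 0
              for z in range(N)] for y in range(N)] for x in range(N)]

def damage(model, n_planes: int):
    """Synthetic capsid damage: zero out the top n_planes x-layers."""
    out = [[[v for v in row] for row in plane] for plane in model]
    for x in range(N - n_planes, N):
        out[x] = [[0] * N for _ in range(N)]
    return out

def project(model):
    """2D projection by summing voxels along z, like a micrograph."""
    return [[sum(model[x][y][z] for z in range(N))
             for y in range(N)] for x in range(N)]

def rel_difference(p1, p2):
    """Fraction of projected mass that differs between two projections."""
    total = sum(map(sum, p1))
    diff = sum(abs(a - b) for r1, r2 in zip(p1, p2) for a, b in zip(r1, r2))
    return diff / total

intact = sphere_model(radius=6)
proj_intact = project(intact)
proj_damaged = project(damage(intact, n_planes=4))
print(f"Projected damage: {rel_difference(proj_intact, proj_damaged):.0%}")
```

Comparing many such synthetically damaged projections against experimental micrographs is what lets damage be quantified without first separating the viruses from the iron clumps.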

The researchers found that the damage in electrocoagulated viruses ranged from 10% to greater than 60%. Further, by following the same analysis with bacteriophages isolated from conventional coagulation, they observed that the viruses were not inactivated.

“The traditional multistep process of water purification has been there to ensure that even if one step fails, the subsequent ones can bail you out—a multiple barrier approach, so to speak,” Chellam said. “What we are proposing with electrocoagulation is process intensification, where coagulation and disinfection are combined within a single step before subsequent purification stages, to ensure better protection against waterborne pathogens.”



More information:
Kyungho Kim et al, Virus Removal and Inactivation Mechanisms during Iron Electrocoagulation: Capsid and Genome Damages and Electro-Fenton Reactions, Environmental Science & Technology (2021). DOI: 10.1021/acs.est.0c04438

Citation:
Zapping untreated water gets rid of more waterborne viruses (2021, October 20)
retrieved 20 October 2021
from https://phys.org/news/2021-10-zapping-untreated-waterborne-viruses.html
