Hexbyte Glen Cove Nature-inspired coatings could power tiny chemistry labs for medical testing and more

Hexbyte Glen Cove

A newly developed coating that allows certain liquids to move across surfaces without fluid loss could usher in advances in a range of fields, including medical testing.

This new coating—created in the DREAM (Durable Repellent Engineered Advanced Materials) Laboratory, led by University of Toronto Engineering Professor Kevin Golovin—was inspired by the natural world.

“Nature has already developed strategies to transport liquids across surfaces in order to survive,” says Mohammad Soltani, a researcher in the DREAM Laboratory and lead author of a new paper recently published in Advanced Functional Materials.

“We were inspired by the structural model of natural materials such as cactus leaves or spider silk. Our coating can directionally transport not only water, but also low surface tension liquids that easily spread on most surfaces.”

The innovation has important implications for microfluidics, a field where small quantities of liquids are transported within tiny channels, often less than a millimetre wide. This technique has many applications, one of them being to miniaturize the standard analytical tests that are currently performed in chemical laboratories.

New polymer coatings, developed by Professor Kevin Golovin and his team at the University of Toronto, show the precision with which liquids can move across surfaces. Credit: Mohammad Soltani / University of Toronto Engineering

By reducing the quantity of sample and reagents required, and automating protocols for working with them, microfluidics can power lab-on-a-chip devices that offer fast, inexpensive medical tests. Proponents hope this could lead to diagnosing multiple conditions in minutes using only a drop or two of blood.

But current microfluidic devices have a key limitation: they can only effectively handle liquids with high surface tension, such as water. This property, also known as cohesion, means that the liquid has a greater tendency to stick to itself than to the sides of the channel it is being transported through.

High surface tension liquids form discrete droplets that can be moved around independently, like raindrops on window glass. Cohesion can even be exploited to pull the liquid droplets along the channel through a process known as capillary action.

By contrast, low surface tension liquids, such as alcohols and other solvents, tend to stick to the sides of the channels, and can currently be transported for only about 10 millimetres before the droplet disintegrates. Capillary action no longer applies, so this transport requires an external force, such as magnetism or heat, to move the droplets.
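The surface-tension contrast behind this behaviour can be made concrete with the classic capillary-rise formula, h = 2γ·cos(θ)/(ρ·g·r). This is a standard textbook illustration, not the model used in the paper; the 0.5 mm channel radius and the contact angles are assumptions, while the surface tensions and densities are textbook values for water and ethanol:

```python
import math

def capillary_rise(gamma, theta_deg, rho, radius, g=9.81):
    """Height (m) a liquid climbs in a channel of the given radius (m),
    given surface tension gamma (N/m), contact angle (deg), density (kg/m^3)."""
    return 2 * gamma * math.cos(math.radians(theta_deg)) / (rho * g * radius)

r = 0.5e-3  # assumed 0.5 mm channel radius

# Textbook room-temperature values; contact angle 0 assumes a wetting surface.
water_h = capillary_rise(gamma=0.072, theta_deg=0, rho=1000, radius=r)
ethanol_h = capillary_rise(gamma=0.022, theta_deg=0, rho=789, radius=r)

print(f"water:   {water_h * 1000:.1f} mm")   # ~29 mm
print(f"ethanol: {ethanol_h * 1000:.1f} mm")  # ~11 mm
```

Water's surface tension is roughly three times ethanol's, so it climbs much higher; more importantly, low-tension liquids wet almost any wall, which is why they smear along channels instead of holding together as droplets.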

The new coating enables low surface tension liquids to be transported over distances of more than 150 millimetres without losing any of the liquid, about 15 times farther than currently possible.

Credit: University of Toronto

The technology uses two newly developed polymer coatings, one of which is more liquid-repellent than the other. Both are composed of liquid-like polymer brushes. The more repellent coating acts as a background, surrounding the less repellent one and creating tiny channels along the surface. The channels allow liquids to move in a desired pattern or direction without any loss during transport and without requiring additional energy input.

“Polymer brush coatings reduce the fluid friction and allow the droplets to be transported passively,” says Soltani. “Less friction means more energy is available to transport the liquid. We then create a driving force by designing the channels with specific patterns.”

The ability to transport low surface tension liquids without loss will allow for advancements in lab-on-a-chip devices. Using these unique coatings, researchers have the ability to transport liquids over a longer range, move multiple liquids at the same time along a precise pathway and even merge and split droplets—all without losing any volume or experiencing cross-contamination.

This technology will also help limit waste in research labs. With no residue left behind on the surface of the device and therefore no risk of cross-contamination, researchers can use the same devices over and over again.

“We’re looking at using this technology for microfluidics bioassays, as this is an area where every drop of liquid is precious,” says Golovin. “Our findings also have great potential to advance point-of-care diagnostics, such as liver or kidney disease, where biological marker screening is often performed in non-aqueous media.”



More information:
Mohammad Soltani et al, Lossless, Passive Transportation of Low Surface Tension Liquids Induced by Patterned Omniphobic Liquidlike Polymer Brushes, Advanced Functional Materials (2021). DOI: 10.1002/adfm.202107465

Citation:
Nature-inspired coatings could power tiny chemistry labs for medical testing and more (2021, October 22)
retrieved 24 October 2021
from https://phys.org/news/2021-10-nature-inspired-coatings-power-tiny-chemistry.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.

Read More Hexbyte Glen Cove Educational Blog Repost With Backlinks —

Hexbyte Glen Cove Astronomers discover infant planet


Discovery image of the planet 2M0437, which lies about 100 times the Earth-Sun distance from its parent star. The image was taken with the Subaru Telescope on Maunakea. The much-brighter host star has been mostly removed, and the four “spikes” are artifacts produced by the optics of the telescope. Credit: Subaru Telescope and Gaidos, et al. (2021)

One of the youngest planets ever found around a distant infant star has been discovered by an international team of scientists led by University of Hawaiʻi at Mānoa faculty, students, and alumni.

Thousands of planets have been discovered around other stars, but what sets this one apart is that it is newly formed and can be directly observed. The planet, named 2M0437b, joins a handful of objects advancing our understanding of how planets form and change with time, helping shed new light on the origin of the Solar System and Earth. The in-depth research was recently published in Monthly Notices of the Royal Astronomical Society.

“This serendipitous discovery adds to an elite list of planets that we can directly observe with our telescopes,” explained lead author Eric Gaidos, a professor in the UH Mānoa Department of Earth Sciences. “By analyzing the light from this planet we can say something about its composition, and perhaps where and how it formed in a long-vanished disk of gas and dust around its host star.”

The researchers estimate that the planet is a few times more massive than Jupiter, and that it formed with its star several million years ago, around the time the main Hawaiian Islands first emerged above the ocean. The planet is so young that it is still hot from the energy released during its formation, with a temperature similar to the lava erupting from Kīlauea Volcano.

Subaru Telescope and Keck Observatory on Maunakea. Credit: University of Hawaii Institute for Astronomy

Key Maunakea telescopes

In 2018, 2M0437b was first seen with the Subaru Telescope on Maunakea by UH Institute for Astronomy (IfA) visiting researcher Teruyuki Hirano. For the past several years, it has been studied carefully using other telescopes on the mauna.

Gaidos and his collaborators used the Keck Observatory on Maunakea to monitor the position of the star as it moved across the sky, confirming that planet 2M0437b was truly a companion to the star and not a more distant object. The observations required three years because the star moves slowly across the sky.

The planet and its parent star lie in a stellar “nursery” called the Taurus Cloud. 2M0437b is on a much wider orbit than the planets in the Solar System; its current separation is about one hundred times the Earth-Sun distance, making it easier to observe. However, sophisticated “adaptive” optics are still needed to compensate for the image distortion caused by Earth’s atmosphere.

“Two of the world’s largest telescopes and Maunakea’s clear skies were all needed to make this discovery,” said co-author Michael Liu, an astronomer at IfA. “We are all looking forward to more such discoveries, and more detailed studies of such planets with the technologies and telescopes of the future.”

Future research potential

Gathering more in-depth research about the newly-discovered planet may not be too far away. “Observations with space telescopes such as NASA’s Hubble and the soon-to-be-launched James Webb Space Telescope could identify gases in its atmosphere and reveal whether the planet has a moon-forming disk,” Gaidos added.



More information:
Zodiacal Exoplanets in Time (ZEIT) XII: A Directly-Imaged Planetary-Mass Companion to a Young Taurus M Dwarf Star, Monthly Notices of the Royal Astronomical Society (2021). DOI: 10.1093/mnras/stab3069 , arxiv.org/abs/2110.08655

Citation:
Astronomers discover infant planet (2021, October 22)
retrieved 24 October 2021
from https://phys.org/news/2021-10-astronomers-infant-planet.html



Hexbyte Glen Cove Machine learning predicts antibiotic resistance spread


Credit: CC0 Public Domain

Genes aren’t only inherited through birth. Bacteria have the ability to pass genes to each other, or pick them up from their environment, through a process called horizontal gene transfer, which is a major culprit in the spread of antibiotic resistance.

Cornell researchers used machine learning to sort bacteria by their functions and used this information to predict with near-perfect accuracy how genes are transferred between them, an approach that could potentially be used to stop the spread of antibiotic resistance.

The team’s paper, “Functions Predict Horizontal Gene Transfer and the Emergence of Antibiotic Resistance,” published Oct. 22 in Science Advances. The lead author is doctoral student Hao Zhou.

“Organisms basically can acquire resistance genes from other organisms. And so it would help if we knew which organisms were exchanging genes with, and not only that, but we could figure out what are the driving factors that implicate organisms in this transfer,” said Ilana Brito, assistant professor and the Mong Family Sesquicentennial Faculty Fellow in Biomedical Engineering in the College of Engineering, and the paper’s senior author. “If we can figure out who is exchanging genes with who, then maybe it would give insight into how this actually happens and possibly even control these processes.”

Many novel traits are shared through gene transfer. But scientists haven’t been able to determine why some bacteria engage in gene transfer while others do not.

Instead of testing individual hypotheses, Brito’s team looked to bacterial genomes and their various functions—which can range from DNA replication to metabolizing carbohydrates—in order to identify signatures that indicate “who” was swapping genes and what was driving these networks of exchange.

Brito’s team used several machine-learning models, each of which teased out different phenomena embedded in the data. This enabled them to identify multiple networks of exchange for different antibiotic resistance genes, even across strains of the same organism.

For the study, the researchers focused on organisms associated with soil, plants and oceans, but their model is also well-suited to look at human-associated organisms and pathogens, such as Acinetobacter baumannii and E. coli, and within localized environments, such as an individual’s gut microbiome.

They found the machine-learning models were particularly effective when applied to antibiotic resistance genes.
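The paper's actual pipeline is not reproduced here, but the core idea, predicting whether two organisms exchange genes from their functional profiles, can be sketched with a toy classifier on synthetic data. Everything below is invented for illustration: the 20 functional categories, the rule that pairs sharing "function 0" (a stand-in for a mobility-related function) exchange genes, and the choice of a random forest.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "genomes": each organism is a binary vector over 20 functional
# categories (e.g. DNA replication, carbohydrate metabolism).
n_organisms, n_functions = 60, 20
genomes = rng.integers(0, 2, size=(n_organisms, n_functions))

# Toy ground truth: pairs that both carry function 0 exchange genes.
pairs, labels = [], []
for i in range(n_organisms):
    for j in range(i + 1, n_organisms):
        pairs.append(np.concatenate([genomes[i], genomes[j]]))
        labels.append(int(genomes[i, 0] and genomes[j, 0]))

X, y = np.array(pairs), np.array(labels)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Train on organism pairs, evaluate on held-out pairs.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"held-out accuracy: {accuracy:.2f}")
```

Because the synthetic rule is simple, the classifier recovers it almost perfectly; the study's contribution is showing that real genomes carry comparably predictive functional signatures.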

“I think one of the big takeaways here is that the network of bacterial gene exchange—specifically for antibiotic resistance—is predictable,” Brito said. “We can understand it by looking at the data, and we can do better if we actually look at each organism’s genome.”

One of the most surprising findings was that the modeling predicted many possible antibiotic resistance transfers between human-associated bacteria and pathogens that haven’t yet been observed. These probable, yet undetected, transfer events were almost exclusive to human-associated bacteria in the gut or oral microbiome.

The research is emblematic of Cornell’s recently launched Center for Antimicrobial Resistance, according to Brito, who serves on the center’s steering committee.

“One can imagine that if we can predict how these genes spread, we might be able to either intervene or choose a specific antibiotic, depending on what we see in a patient’s gut,” Brito said. “More broadly, we may see where certain types of organisms are predicted to transfer with others in a certain environment. And we think there might be novel antibiotic targets in the data. For example, genes that could cripple these organisms, potentially, in terms of their ability to persist in certain environments or acquire these resistance genes.”

Juan Felipe Beltrán, Ph.D. ’19, contributed to the research.



More information:
Hao Zhou et al, Functions predict horizontal gene transfer and the emergence of antibiotic resistance, Science Advances (2021). DOI: 10.1126/sciadv.abj5056. www.science.org/doi/10.1126/sciadv.abj5056

Citation:
Machine learning predicts antibiotic resistance spread (2021, October 22)
retrieved 24 October 2021
from https://phys.org/news/2021-


Hexbyte Glen Cove US targeting Feb. 2022 to launch new lunar program Artemis


A NASA employee holds the official Artemis mission patch at NASA Plum Brook Station in Sandusky, Ohio.

NASA is aiming to launch its uncrewed lunar mission Artemis 1 in February next year, the space agency said Friday, the first step in America’s plan to return humans to the Moon.

The agency had initially hoped to launch the mission by the end of this year, with astronauts set to walk on the Moon by 2024.

It achieved a major milestone Wednesday when it stacked the Orion crew capsule atop its Space Launch System megarocket, which now stands 322 feet (98 meters) tall inside the Vehicle Assembly Building at NASA Kennedy Space Center in Florida.

After further tests, it will be wheeled out to the launchpad for a “wet dress rehearsal” in January, with the first window for launch opening in February, officials told reporters on a call.

“The February launch period opens on the 12th and our last opportunity in February is on the 27th,” said Mike Sarafin, Artemis 1 mission manager. The next windows are in March and April.

These potential launch periods are dependent on orbital mechanics and the relative position of the Earth with respect to the Moon. The mission duration is expected to be four to six weeks.

NASA will also deploy small satellites, known as CubeSats, to perform experiments and technology demonstrations.

NASA achieved a major milestone when it stacked the Orion crew capsule atop its Space Launch System megarocket.

Artemis 2 is then scheduled for 2023 and Artemis 3 for the following year, when humans will walk on the Moon for the first time since 1972. Both missions are now likely to be pushed back, however.

NASA says the moonwalkers will include the first woman and first person of color to make the trip.

The agency is seeking to establish a sustainable presence on the Moon as well as use the lessons it learns to plan a crewed trip to Mars in the 2030s.

Selfies with the Moon

Orion first flew into space in 2014, launched by a Delta IV rocket, making two circumnavigations of Earth and testing its heat shield on re-entry into the atmosphere.

But this time, NASA’s Sarafin said, the capsule will travel much faster and the temperature will be much higher when it returns from the Moon.

Artemis 1 has several objectives: to demonstrate Orion’s ability to return from the Moon and operate in deep space, where it is “much colder than in low Earth orbit,” and to successfully recover the spacecraft, he added.

The mission also plans to study radiation and take selfies of the capsule with the Moon in the background.



© 2021 AFP

Citation:
US targeting Feb. 2022 to launch new lunar program Artemis (2021, October 23)
retrieved 23 October 2021
from https://phys.org/news/2021-10-feb-lunar-artemis.html



Hexbyte Glen Cove Big tech data centers spark worry over scarce Western water


In this Tuesday, Oct. 5, 2021, photo, The Dalles Mayor Richard Mays looks at the view of his town and the Columbia River from his hilltop home in The Dalles, Oregon. Mays helped negotiate a proposal by Google to build new data centers in the town. The data centers require a lot of water to cool their servers, and would use groundwater and surface water, but not any water from the Columbia River. Credit: AP Photo/Andrew Selsky

Conflicts over water are as old as history itself, but the massive Google data centers on the edge of this Oregon town on the Columbia River represent an emerging 21st century concern.

Now a critical part of modern computing, data centers help people stream movies on Netflix, conduct transactions on PayPal, post updates on Facebook, store trillions of photos and more. But a single facility can also churn through millions of gallons of water per day to keep hot-running equipment cool.

Google wants to build at least two more data centers in The Dalles, worrying some residents who fear there eventually won’t be enough water for everyone—including for area farms and fruit orchards, which are by far the biggest users.

Across the United States, there has been some mild pushback as tech companies build and expand data centers—conflicts likely to grow as water becomes a more precious resource amid the threat of climate change and as the demand for cloud computing grows. Some tech giants have been using cutting-edge research and development to find less impactful cooling methods, but there are those who say the companies can still do more to be environmentally sustainable.

The concerns are understandable in The Dalles, the seat of Wasco County, which is suffering extreme and exceptional drought, according to the U.S. Drought Monitor. The region last summer endured its hottest days on record, reaching 118 degrees Fahrenheit (48 Celsius) in The Dalles.

This Tuesday, Oct. 5, 2021, photo, shows the historic Granada Theater building in downtown The Dalles, Oregon. As demand for cloud computing grows, the world’s biggest tech companies are building more data centers, including in arid regions even though they use vast amounts of water per day. Some residents of The Dalles, Oregon, are objecting to a proposal by Google to build more data centers there, fearing that, amid rising temperatures and drought, there won’t be enough water for everyone. Credit: AP Photo/Andrew Selsky

The Dalles is adjacent to the mighty Columbia River, but the new data centers wouldn’t be able to use that water and instead would have to take water from rivers and groundwater that has gone through the city’s water treatment plant.

However, the snowpack in the nearby Cascade Range that feeds the aquifers varies wildly year-to-year and glaciers are melting. Most aquifers in north-central Oregon are declining, according to the U.S. Geological Survey Groundwater Resources Program.

Adding to the unease: The 15,000 town residents don’t know how much water the proposed data centers will use, because Google calls it a trade secret. Even the town councilors, who are scheduled to vote on the proposal on Nov. 8, had to wait until this week to find out.

Dave Anderson, public works director for The Dalles, said Google obtained the rights to 3.9 million gallons of water per day when it purchased land formerly home to an aluminum smelter. Google is requesting less water for the new data centers than that amount and would transfer those rights to the city, Anderson said.

In this Tuesday, Oct. 5, 2021, photo, people walk in the streets undergoing a renewal project to fix decayed buildings downtown The Dalles, Oregon. As demand for cloud computing grows, the world’s biggest tech companies are building more data centers, including in arid regions even though they use vast amounts of water per day. Some residents of The Dalles, Oregon, are objecting to a proposal by Google to build more data centers there, fearing that, amid rising temperatures and drought, there won’t be enough water for everyone. Credit: AP Photo/Andrew Selsky

“The city comes out ahead,” he said.

For its part, Google said it’s “committed to the long-term health of the county’s economy and natural resources.”

“We’re excited that we’re continuing conversations with local officials on an agreement that allows us to keep growing while also supporting the community,” Google said, adding that the expansion proposal includes a potential aquifer program to store water and increase supply during drier periods.

The U.S. hosts 30% of the world’s data centers, more than any other country. Some data centers are trying to become more efficient in water consumption, for example by recycling the same water several times through a center before discharging it. Google even uses treated sewage water, instead of using drinking water as many data centers do, to cool its facility in Douglas County, Georgia.

In this Tuesday, Oct. 5, 2021, photo, Dawn Rasmussen stands at her well at her property on the outskirts of The Dalles, Oregon. She says the water table that her well draws from has dropped 15 feet in the last 15 years. She has deep concerns about Google’s proposal to build more data centers, which use vast amounts of water, in the town. The city council is expected to vote soon on Google’s proposal. As demand for cloud computing grows, the world’s biggest tech companies are building more data centers, including in arid regions even though they use vast amounts of water per day. Credit: AP Photo/Andrew Selsky

Facebook’s first data center took advantage of the cold high-desert air in Prineville, Oregon, to chill its servers, and went a step further when it built a center in Lulea, Sweden, near the Arctic Circle.

Microsoft even placed a small data center, enclosed in what looks like a giant cigar, on the seafloor off Scotland. After retrieving the barnacle-encrusted container last year, following two years underwater, company employees saw improvement in overall reliability because the servers weren’t subjected to temperature fluctuations and corrosion from oxygen and humidity. Team leader Ben Cutler said the experiment shows data centers can be kept cool without tapping freshwater resources.

A study published in May by researchers at Virginia Tech and Lawrence Berkeley National Laboratory showed one-fifth of data centers rely on water from moderately to highly stressed watersheds.

Tech companies typically consider tax breaks and availability of cheap electricity and land when placing data centers, said study co-author Landon Marston, assistant professor of civil and environmental engineering at Virginia Tech.

In this Tuesday, Oct. 5, 2021, photo, shows the exterior of a Google data center in The Dalles, Oregon. The Dalles City Council member Long-Curtiss wants to know more details about Google’s proposal to build more data centers in the town before the city council votes on the matter. As demand for cloud computing grows, the world’s biggest tech companies are building more data centers, including in arid regions even though they use vast amounts of water per day. Some residents of The Dalles, Oregon, are objecting to a proposal by Google to build more data centers there, fearing that, amid rising temperatures and drought, there won’t be enough water for everyone. Credit: AP Photo/Andrew Selsky

They need to consider water impacts more seriously, and put the facilities in regions where they can be better sustained, both for the good of the environment and their own bottom line, Marston said.

“It’s also a risk and resilience issue that data centers and their operators need to face, because the drought that we’re seeing in the West is expected to get worse,” Marston said.

About an hour’s drive east of The Dalles, Amazon is giving back some of the water its massive data centers use. Amazon’s sprawling campuses, spread between Boardman and Umatilla, Oregon, butt up against farmland, a cheese factory and neighborhoods. Like many data centers, they use water primarily in summer, with the servers being air-cooled the rest of the year.

About two-thirds of the water Amazon uses evaporates. The rest is treated and sent to irrigation canals that feed crops and pastures.

In this Tuesday, Oct. 5, 2021, photo, shows the exterior of a Google data center in The Dalles, Oregon. The Dalles City Council member Long-Curtiss wants to know more details about Google’s proposal to build more data centers in the town before the city council votes on the matter. Credit: AP Photo/Andrew Selsky

Umatilla City Manager Dave Stockdale appreciates that farms and ranches are getting that water, since the main issue the city had as Amazon’s facilities grew was that the city water treatment plant couldn’t have handled the data centers’ discharge.

John DeVoe, executive director of WaterWatch of Oregon, which seeks reform of water laws to protect and restore rivers, criticized it as a “corporate feel good tactic.”

“Does it actually mitigate for any harm of the server farm’s actual use of water on other interests who may also be using the same source water, like the environment, fish and wildlife?” DeVoe said.

Adam Selipsky, CEO of Amazon Web Services, insists that Amazon feels a sense of responsibility for its impacts.

“We have intentionally been very conscious about water usage in any of these projects,” he said, adding that the centers brought economic activity and jobs to the region.

  • This Tuesday, Oct. 5, 2021, photo, shows the land beyond the fencing where Google wants to build more data centers at the site of a former aluminum smelter in The Dalles, Oregon. The Dalles City Council member Long-Curtiss wants to know more details about Google’s proposal to build more data centers in the town before the city council votes on the matter. As demand for cloud computing grows, the world’s biggest tech companies are building more data centers, including in arid regions even though they use vast amounts of water per day. Some residents of The Dalles, Oregon, are objecting to a proposal by Google to build more data centers there, fearing that, amid rising temperatures and drought, there won’t be enough water for everyone. Credit: AP Photo/Andrew Selsky
  • In this Tuesday, Oct. 5, 2021, photo, The Dalles city Public Works Department Director Dave Anderson, left, and Mayor Richard Mays pose for photos with the Columbia River in The Dalles, Oregon. Mays helped negotiate a proposal by Google to build new data centers in the town. The data centers require a lot of water to cool their servers, and would use groundwater and surface water, but not any water from the Columbia River. Credit: AP Photo/Andrew Selsky
  • In this Tuesday, Oct. 5, 2021, photo, The Dalles city Public Works Department Director Dave Anderson, left, and Mayor Richard Mays pose for photos in The Dalles, Oregon. Mays helped negotiate a proposal by Google to build new data centers in the town. The data centers require a lot of water to cool their servers, and would use groundwater and surface water, but not any water from the Columbia River. Credit: AP Photo/Andrew Selsky

Dawn Rasmussen, who lives on the outskirts of The Dalles, worries that her town is making a mistake in negotiating with Google, likening it to David versus Goliath.

She’s seen the level of her well-water drop year after year and worries sooner or later there won’t be enough for everyone.

“At the end of the day, if there’s not enough water, who’s going to win?” she asked.



© 2021 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.

Citation:
Big tech data centers spark worry over scarce Western water (2021, October 22)
retrieved 23 October 2021
from https://phys.org/news/2021-10-big-tech-centers-scarce-western.html



Hexbyte Glen Cove NASA targeting Feb. 2022 to launch new lunar program Artemis


A NASA employee holds the official Artemis mission patch at NASA Plum Brook Station in Sandusky, Ohio.

NASA said Friday it is now targeting February 2022 for the uncrewed lunar mission Artemis 1, the first step in America’s plan to return humans to the Moon later this decade.

The agency had initially wanted to launch the mission by the end of this year, with astronauts landing on the Moon by 2024 on Artemis 3, but the timeline has slipped back.

It achieved a major milestone Wednesday when it stacked the Orion crew capsule atop its Space Launch System megarocket, which now stands 322 feet (98 meters) tall inside the Vehicle Assembly Building at NASA Kennedy Space Center in Florida.

After further tests, it will be wheeled out to the launchpad for a final test known as the “wet dress rehearsal” in January, with the first window for launch opening in February, officials told reporters on a call.

“The February launch period opens on the 12th and our last opportunity in February is on the 27th,” said Mike Sarafin, Artemis 1 manager.

The next windows are in March and then April.

These potential launch periods are dependent on orbital mechanics and the relative position of the Earth with respect to the Moon.

The mission duration is expected to be four to six weeks.

It will also deploy a number of small satellites, known as CubeSats, to perform experiments and technology demonstrations.

Although likely to be pushed back, Artemis 2 is currently scheduled for 2023 and Artemis 3 for 2024, marking humanity’s return to the Moon for the first time since the Apollo 17 mission in 1972.

NASA says the moonwalkers will include the first woman and first person of color to make the trip.

The agency is seeking to establish a sustainable presence on the Moon, and use the lessons it learns to plan a crewed trip to Mars in the 2030s.



© 2021 AFP

Citation:
NASA targeting Feb. 2022 to launch new lunar program Artemis (2021, October 22)
retrieved 23 October 2021
from https://phys.org/news/2021-10-nasa-feb-lunar-artemis.html



Hexbyte Glen Cove NASA launches tool that measures Western water loss


Credit: Pixabay/CC0 Public Domain

NASA on Thursday launched an online platform with information on how much water evaporates into the atmosphere from plants, soils and other surfaces in the U.S. West, data it says could help water managers, farmers and state officials better manage resources in the parched region.

The platform, OpenET, uses satellite imagery from the Landsat program, a decades-long project of NASA and the U.S. Geological Survey that records human and natural impacts on Earth’s surface.

Specifically, it provides data for 17 Western states—down to the quarter-acre—on how much evapotranspiration has taken place. That’s the process by which moisture in leaves, soil and other surfaces evaporates into the air.

The West has been mired in drought for more than two decades. Scientists say human-caused climate change has intensified conditions. Water levels at key reservoirs on the Colorado River have fallen to historic lows alongside growing demand, prompting the federal government to declare mandatory water cuts for some states next year. A blazing summer and years of record-breaking wildfires have also sapped moisture from the ground.

Detailed information on soil moisture could help farmers and water managers better plan during dry conditions and reduce how much water is used for irrigation, NASA scientists said on a Thursday call with reporters.

“Farmers and water managers have not had consistent, timely data on one of the most important pieces of information for managing water, which is the amount of water that’s consumed by crops and other plants as they grow,” said Robyn Grimm, a water specialist with the Environmental Defense Fund, which helped NASA develop the tool alongside other environmental groups and Google.

“To date, that data has been expensive and fragmented,” she said.

Many large farms in dry areas, such as California’s Central Valley, already have years of experience using advanced data systems to measure evapotranspiration and other water metrics that influence their growing and harvesting seasons and watering schedules.

Cannon Michael runs an 11,000-acre (4,452 hectare) farm in Merced County, California, that produces tomatoes, melons, cotton and alfalfa. Michael said he looked at NASA’s new platform, but didn’t think it would provide any additional benefit for his farm.

“We closely monitor and understand our water use,” he said. “Our farm is 75% drip irrigation, and we have a very detailed scheduling and forecasting process already in place.”

Meanwhile, Colorado rancher Joe Stanko in Steamboat Springs had read about the new tool in a magazine. Her family grows hay for their cattle, and she said the platform could help them determine which fields need more water to replenish soil moisture. It could also help them decide when to harvest hay.

NASA said the platform includes historical data dating back to 1984. In coming months, it will be updated to include information about precipitation rates with the same level of detail. Eventually, the tool will extend to other parts of the U.S., including areas around the Mississippi River and Appalachian region, scientists said.



© 2021 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.

Citation:
NASA launches tool that measures Western water loss (2021, October 22)


When and why did human brains decrease in size 3,000 years ago? Ants may hold clues


Credit: Pixabay/CC0 Public Domain

The brain is the most complex organ in the human body. Now, a new study has brought us closer to understanding some of its evolution. It shows that human brains decreased in size approximately 3,000 years ago. By studying ants as models to illustrate why brains may increase or decrease in size, the researchers hypothesize that brain shrinkage parallels the expansion of collective intelligence in human societies.

Studying and understanding the causes and consequences of brain evolution helps us understand the nature of humanity. It is well documented that human brains increased in size over the course of our evolutionary history. Less appreciated is the fact that human brains have decreased in size since the Pleistocene. When exactly these changes happened, or why, was not well known.

“A surprising fact about humans today is that our brains are smaller compared to the brains of our Pleistocene ancestors. Why our brains have reduced in size has been a big mystery for anthropologists,” explained co-author Dr. Jeremy DeSilva, from Dartmouth College.

To disentangle this mystery, a team of researchers from different academic fields set out to study the historical patterns of human brain evolution, comparing their findings with what is known in ant societies to offer broad insights.

“A biological anthropologist and a behavioral ecologist and evolutionary neurobiologist began sharing their thoughts on brain evolution and found bridging research on humans and ants might help identify what is possible in nature,” said co-author Dr. James Traniello, from Boston University.

Their paper, published in Frontiers in Ecology and Evolution, sheds new light on the evolution of our brain.

A recent size decrease

The researchers applied a change-point analysis to a dataset of 985 fossil and modern human crania. They found that human brains increased in size 2.1 million years ago and 1.5 million years ago, during the Pleistocene, but decreased in size around 3,000 years ago (Holocene), which is more recent than previous estimates.
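Change-point analysis locates the points where the statistical properties of an ordered series shift. As a rough illustration of the idea only (not the authors’ actual method or dataset), a minimal sketch that finds a single mean shift in a synthetic series of cranial-capacity-like values by minimizing within-segment variance:

```python
import numpy as np

def best_change_point(y):
    """Return the split index that minimizes the total
    within-segment sum of squared deviations."""
    best_k, best_cost = None, np.inf
    for k in range(1, len(y)):  # candidate split points
        left, right = y[:k], y[k:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# Synthetic example: a downward mean shift at index 60
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(1400, 30, 60), rng.normal(1300, 30, 40)])
print(best_change_point(y))  # typically a value near 60
```

Real analyses of fossil series handle multiple change points and uneven sampling in time, but the underlying idea of comparing segment fits is the same.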

“Most people are aware that humans have unusually large brains—significantly larger than predicted from our body size. In our deep evolutionary history, human brain size dramatically increased,” said Traniello. “The reduction in human brain size 3,000 years ago was unexpected.”

The timing of the size increases coincides with what was previously known about the early evolution of Homo and the technological advancements that led to, for example, better diets and nutrition and larger social groups.

As for the decrease in brain size, the interdisciplinary team of researchers propose a new hypothesis, finding clues within ant societies.

What could ants teach us about human brain evolution?

“We propose that ants can provide diverse models to understand why brains may increase or decrease in size due to social life. Understanding why brains increase or decrease is difficult to study using only fossils,” explained Traniello.

Studying computational models and patterns of worker ant brain size, structure, and energy use in some ant clades, such as the Oecophylla weaver ant, Atta leafcutter ants, or the common garden ant Formica, showed that group-level cognition and division of labor may select for adaptive brain size variation. This means that within a social group where knowledge is shared or individuals are specialists at certain tasks, brains may adapt to become more efficient, such as decreasing in size.

“Ant and human societies are very different and have taken different routes in social evolution,” Traniello said. “Nevertheless, ants also share with humans important aspects of social life such as group decision-making and division of labor, as well as the production of their own food (agriculture). These similarities can broadly inform us of the factors that may influence changes in human brain size.”

Brains use up a lot of energy, and smaller brains use less energy. The externalization of knowledge in human societies, which reduces the energy each individual must spend storing information, may have favored a decrease in brain size.

“We propose that this decrease was due to increased reliance on collective intelligence, the idea that a group of people is smarter than the smartest person in the group, often called the ‘wisdom of the crowds,'” added Traniello.

DeSilva concluded, “We look forward to having our hypothesis tested as additional data become available.”



More information:
Jeremy DeSilva et al, When and Why Did Human Brains Decrease in Size? A New Change-Point Analysis and Insights from Brain Evolution in Ants, Frontiers in Ecology and Evolution (2021). DOI: 10.3389/fevo.2021.742639


Study: Only 2% of global giving goes to curb climate change


Credit: CC0 Public Domain

Global philanthropic spending to help halt climate change grew last year—but still remains less than 2% of all giving, according to a new report from the ClimateWorks Foundation.

That’s a really low number, especially as the detrimental effects of climate change have become clearer, says Surabi Menon, vice president of global intelligence at ClimateWorks and a co-author of the report. “With the extreme events taking place, we thought it was shocking that there was so little money.”

Giving by foundations and individual donors to mitigate climate change grew by 14% in 2020, according to the report, “Funding Trends 2021: Climate Change Mitigation Philanthropy.”

The group estimates that $5 billion to $9 billion went to curb climate change in 2019. Last year, that figure grew to $6 billion to $10 billion. The estimate does not include the nearly $800 million that the Bezos Earth Fund gave to climate-change efforts at the end of 2020, because the announcement came late in the year and the group did not have the data it needed to assess those gifts.
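The quoted figures are mutually consistent: if $6 billion to $10 billion is at most 2% of all giving, total annual giving must be at least roughly $300 billion to $500 billion. A quick check of that arithmetic:

```python
# If climate giving of $6-10B is at most 2% of all giving,
# the implied total annual giving is at least:
climate_low, climate_high = 6e9, 10e9
max_share = 0.02
total_low = climate_low / max_share
total_high = climate_high / max_share
print(f"implied total giving: at least ${total_low/1e9:.0f}B to ${total_high/1e9:.0f}B")
```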

The failure to invest in efforts to fight the climate crisis is not due to a lack of interest from grant makers, says Phil Buchanan, president of the Center for Effective Philanthropy, a research and consulting organization that advises foundations on their grant making. In 2016, his group surveyed foundation leaders about what they saw as the biggest challenges facing society. Climate change was second only to inequality, he says.

The issue may be hard for foundations to address in part because so many grant makers focus their work on regional and local issues, he says. They may not see such a massive global problem as part of their mission or see how local efforts can have an impact on a complex international issue.

In addition, foundations often want to see quantifiable results where they can know that their support produced a particular result, Buchanan says. Climate change—a fraught global issue with very few efforts resulting in short-term identifiable change—does not lend itself well to this mind-set.

“There’s nothing good about the fact that we’re now seeing, with increasing regularity, the obvious impacts of climate change across communities,” Buchanan says. “But maybe that’s the wake-up call that will finally get more philanthropy unleashed toward doing something about it.”

Johannes Lundershausen, the climate knowledge lead at Active Philanthropy, a German organization that advises donors from around the world on how to effectively support efforts to curb climate change, acknowledges that the 2% figure is disappointing, especially given the need for profound changes in this decade.

He says there need to be more ways for donors to get involved without having to start from scratch and those that are already working on climate need to do a better job of collaborating. But, he says, because the issue is relatively new, it’s understandable that the total amount of support is small. “The trajectory is right,” he says. “There has been progress.”

European foundation funding for climate mitigation outside of Europe tripled from 2016 to 2020, according to a report from ClimateWorks, the European Foundation Centre, and the Hour Is Late, released last week.

Menon, too, sees interest increasing. This year, several large foundations have made pledges focused on climate change, including the $10 billion Earth Fund and the Protecting Our Planet Challenge, in which nine foundations pledged to spend $5 billion on conservation over a decade.

The report also found that donors who in 2018 at the Global Climate Action Summit pledged to give $4 billion to climate solutions are meeting those commitments and more have joined them, increasing the pledge to $6 billion to be given by 2025.

The bulk of the money given to climate causes was from individual donors. Foundations accounted for only $1.9 billion of climate-change donations in 2020, but that was up from about $900 million in 2015.

The report breaks down giving by sector and region in an effort to identify gaps in funding and areas in which donors can help spur progress.

In the past, a lot of donors have supported efforts to make the power sector greener and help it move away from coal and other carbon-based fuels, Menon says. But now, she says, there are real opportunities for philanthropy to boost electric transportation and curb greenhouse-gas emissions in the manufacturing and industrial sectors as well as in the food system, where methane gas, a particularly potent contributor to climate change, is an issue.

The report also discussed foundation support for racial-justice efforts that it said attracted more than $500 million in the United States in the first half of 2020. Indigenous people in the United States are 48% more likely to live in areas that are impacted by flooding from sea-level rise, according to an Environmental Protection Agency analysis cited in the report. The EPA also found that Latinos are 43% more likely to lose income due to increased heat and that mortality rates will increase for Black people due to climate change.

The report cited another study that found that in 2016 and 2017 just 1.3% of grants from 12 large environmental grant makers went to environmental-justice groups. But ClimateWorks did not break out funding for those groups in its own report.

“That was one piece we wanted to highlight this year, but we just don’t have the data,” Menon says. ClimateWorks doesn’t have enough information about the environmental-justice movement to accurately determine the amount of money it is getting and how it is being used, she says. “We’d rather work with another entity, other experts who can tell us how to look at the funding in that space and help us.”

Despite the relatively stagnant levels of funding for this issue—Menon says it’s not going to be easy to boost climate funding above 2 percent of philanthropic spending—some new foundations and donors are starting to give money. The Quadrature Climate Foundation, started in 2019 by the founders of a hedge fund, has pledged $100 million a year to fight climate change. And other large announcements by some donors this year are encouraging.

More donors are approaching Active Philanthropy seeking ways to integrate climate into their existing programs, Lundershausen says. Donors that support work on agriculture or economic transformation are looking for ways to include climate change in their approach to those issues, he says. That might also be a way for grant makers in the United States to focus more on climate change, Buchanan says. They won’t have to change everything they do to address climate, just make it a factor in the work they already support on education, hunger, or other issues.

The increased attention and interest is encouraging, says Menon: “We’re really excited to see new foundations coming on board.”


Chip-based optical tweezers levitate nanoparticles in a vacuum


Schematic of optically levitating a nanoparticle with a metalens in a vacuum. The metalens focuses a laser beam to create chip-based optical tweezers. Credit: Kunhong Shen, Purdue University

Researchers have created tiny chip-based optical tweezers that can be used to optically levitate nanoparticles in a vacuum. Optical tweezers—which employ a tightly focused laser beam to hold living cells, nanoparticles and other objects—can be used for a variety of precision measurements and sensing applications. However, these optical traps are usually produced with bulky optical components.

“By using an ultrathin metalens, we reduced the diameter of the focusing lens from about 25 mm to about 0.4 mm,” said research team leader Tongcang Li from Purdue University. “The chip-based design can be used to create an integrated and flexible optical system for studying near-surface forces by trapping an object less than 1 micrometer away from a surface. It might also be useful for trapping cold atoms in a vacuum to study quantum processes.”

In Optica journal, researchers at Purdue University and Pennsylvania State University report the first realization of on-chip optical levitation in a vacuum with an ultrathin metalens. Accomplishing this feat in a vacuum helps improve the sensitivity of the system.

“Optically levitated particles can be used to create accelerometers and gyroscopes that could potentially be used in navigation,” said Li. “Scientists are also using optically levitated particles to search for dark matter and dark energy and to study gravity at small scales, which will deepen our understanding of nature.”

Researchers used a metalens to create chip-based optical tweezers that can optically levitate nanoparticles in a vacuum. The diameter of the metalens was about 50 times smaller than that of a conventional objective lens that they used before. An optical image of the metalens is shown. Credit: Tongcang Li, Purdue University

Toward a portable trap

This new research grows out of previous work in which the researchers used optical levitation in a vacuum to create the fastest human-made rotor and the most sensitive torque detector ever reported.

“As a next step, we wanted to make optical levitation technology more practical by minimizing the system enough to make it portable,” said Li. “We began by reducing the size of the focusing lens by using a metalens, a type of flat lens that uses nanostructures to focus light.”

In the new work, the researchers designed a metalens consisting of thousands of silicon nanopillars. The diameter of the metalens was about 50 times smaller than that of the conventional objective lens that they used before.

“Other research groups have recently demonstrated metalens-based optical trapping in liquids,” said Kunhong Shen, the first author of this work. “Although performing optical trapping in a vacuum helps minimize noise from liquid or air, it is also much more difficult to do.”

The researchers showed that a levitating nanoparticle could be transferred between two separate optical traps. Credit: Kunhong Shen, Purdue University

Levitating with a flat lens

To test their new optical design, the researchers guided an intense laser beam onto the metalens to generate trapping forces. They then sprayed a diluted nanoparticle solution into the trapping area. When a nanoparticle becomes trapped, it will appear as a bright spot that can be observed with a camera. Photon detectors measured the nanoparticle’s movement in real time.

They showed that the metalens could levitate a nanoparticle in a vacuum at a pressure of 2×10⁻⁴ Torr—about 1/4,000,000 of atmospheric pressure—without requiring any feedback stabilization. They were also able to transfer a levitating nanoparticle between two separate optical traps.
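The parenthetical conversion can be verified directly (1 atm = 760 Torr), which gives roughly 1/3,800,000 of an atmosphere; the article’s figure rounds this to 1/4,000,000:

```python
# Convert the trapping pressure from Torr to atmospheres (1 atm = 760 Torr).
pressure_torr = 2e-4
atm_in_torr = 760.0
fraction_atm = pressure_torr / atm_in_torr
print(f"{fraction_atm:.2e} atm, i.e. about 1/{atm_in_torr / pressure_torr:,.0f}")
```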

“Our metalens is a nanostructure layer with a thickness of merely 500 nm and a large numerical aperture of about 0.9. It offers performance similar to that of a conventional bulky lens,” said research team leader Xingjie Ni from the Pennsylvania State University. “The metalens is fully vacuum-compatible. And more interestingly, we can flexibly design it to perform additional functions, for example, filtering out low-spatial-frequency components from the focusing light, which we have proven to be beneficial for optical levitation of nanoparticles.”
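The quoted numerical aperture gives a sense of how tightly the metalens focuses. The diffraction-limited spot diameter is roughly λ/(2·NA); assuming a 1064 nm trapping laser, a common choice in levitation experiments (the article does not state the wavelength):

```python
wavelength_nm = 1064.0    # assumed trapping wavelength; not given in the article
numerical_aperture = 0.9  # value quoted for the metalens
spot_nm = wavelength_nm / (2 * numerical_aperture)
print(f"diffraction-limited spot diameter: about {spot_nm:.0f} nm")
```

Under that assumption the focal spot is well under a micrometer across, consistent with trapping sub-micrometer particles.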

The researchers are now working to improve the tiny levitation devices by boosting the transmission and focusing efficiency of the metalens. They also want to make the diameter of the metalens even smaller to make optical levitation more practical for real-world applications.



More information:
Kunhong Shen et al, On-chip optical levitation with a metalens in vacuum, Optica (2021). DOI: 10.1364/OPTICA.438410

Citation:
Chip-based optical tweezers levitate nanoparticles in a vacuum (2021, October 21)
retrieved 21 October 2021
from https://phys.org/news/2021-10-chip-based-optical-tweezers-levitate-nanoparticles.html
