Hexbyte Glen Cove Banishing bias across disciplines, genders and experience—new tool for fairer research metrics

Left panel: Distribution of within-discipline residuals of the relationship between Arel and log_e years publishing by discipline (ARC = archaeology, CHM = chemistry, ECO = ecology, EVO = evolution and development, GEO = geology, MIC = microbiology, OPH = ophthalmology, PAL = palaeontology), each comprising 60 researchers (30 ♀, 30 ♂). Right panel: Distribution of among-discipline residuals. Credit: Professor Corey Bradshaw at Flinders University.

‘Where researchers are in their career, be they exciting newcomers or distinguished professors, can also have a strong effect on current metrics.

‘The Epsilon Index is a new way to reduce systemic biases in assessing researcher quality via citations by providing career-stage, gender, and opportunity corrections to citation-based performance metrics,’ he says.

The tool is freely available as a ready-made app—simply enter a few data points for a sample of researchers from open-access databases like Google Scholar, and it does the heavy lifting to produce the result, enabling comparison of researchers at any stage of their career and from any discipline on the same scale.

It’s a boon for anyone who wishes to use an objective metric to rank researchers, be it for grant applications, job interviews, promotions and awards, or even as a staff performance indicator.

Relationship between scaled citation mass and log_e years publishing for 480 researchers in eight different disciplines (ARC = archaeology, CHM = chemistry, ECO = ecology, EVO = evolution and development, GEO = geology, MIC = microbiology, OPH = ophthalmology, PAL = palaeontology) comprising 60 researchers each. Credit: Professor Corey Bradshaw at Flinders University.

The approach to develop and test the ε-index was itself an exercise in multi-disciplinarity and specifically engineered to ensure gender balance.

The tool was tested and refined through collaboration on sample data with an archaeologist—Assistant Professor Dr. Stefani Crabtree (Utah State University/Santa Fe Institute), a geologist and vertebrate palaeontologist—Professor Kate Trinajstic (Curtin University), a chemist—Professor Justin Chalker, a microbiologist—Professor Bart Eijkelkamp, a palaeontologist—Professor John Long, an ophthalmologist—Professor Justine Smith, and an evolutionary biologist—Professor Vera Weisbecker (all Flinders University).

The sample comprised 480 researchers with Google Scholar profiles, stratified evenly into eight disciplines (archaeology, chemistry, ecology, evolution and development, geology, microbiology, ophthalmology, palaeontology), three career stages (early-, mid-, late-career), and two genders.

Professor Justine Smith, who’s been a ‘superstar of STEM’ actively promoting women’s engagement in science, technology, engineering and mathematics, hopes the ε-index will make a difference to how women in science are perceived, and in turn encourage more women to enter the sciences.

‘More accurately assessing the contribution of women to the sciences and celebrating their successes is pivotal to encouraging future generations of girls into the sciences; as the saying goes, “you can’t be what you can’t see”. The ε-index gives fairer and greater visibility to women’s achievements, and that it does so across a range of disciplines makes it especially beneficial,’ Professor Smith says.

Violin plots of scaled residuals and m-quotient across all eight disciplines relative to career stage (ECR = early career; MCR = mid-career; LCR = late career). Credit: Professor Corey Bradshaw

How it works

The new index is a ranking algorithm that can be standardised across disciplines, can be corrected for career breaks, and provides a sample-specific threshold that can determine whether individual performance is greater or less than expected relative to the other researchers in a sample.

Whether used via the R code or the online app, the tool requires just four items of information from public databases such as Google Scholar or Scopus to calculate a researcher’s ε-index:

  • the number of citations acquired for the researcher’s top-cited paper
  • the i10-index (number of articles with at least 10 citations)
  • the h-index, and
  • the year in which the researcher’s first peer-reviewed paper was published.

The tool also provides a simple method to scale the index across disciplines with variable citation trends (ε′-index) to enable fairer comparison of researchers in different areas.
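A minimal sketch of the residual-based idea behind the index, assuming Python and the four inputs listed above. The composite “citation mass” and the function name here are illustrative assumptions, not the published ε-index formula; the exact definition is given in Bradshaw et al. (2021).

```python
# Illustrative sketch (not the published formula): regress a composite citation
# measure against log(years publishing) and rank researchers by their residuals,
# i.e. by how far above or below expectation they sit for their career length.
import numpy as np

def illustrative_epsilon(records, current_year=2021):
    """records: list of dicts with keys 'top_paper_citations', 'i10', 'h',
    'first_pub_year' (the four inputs listed above)."""
    years = np.array([current_year - r['first_pub_year'] + 1 for r in records], float)
    # assumed composite "citation mass": log of the product of the three citation metrics
    mass = np.log([max(r['top_paper_citations'], 1) * max(r['i10'], 1) * max(r['h'], 1)
                   for r in records])
    x = np.log(years)
    slope, intercept = np.polyfit(x, mass, 1)      # expected mass given career length
    residuals = mass - (slope * x + intercept)     # above/below expectation
    ranking = np.argsort(-residuals)               # most above-expectation first
    return residuals, ranking
```

In this sketch, a researcher with a positive residual performs above what career length alone would predict within the sample, mirroring the sample-specific threshold described above.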

  • Median ranks among the eight disciplines examined (ARC = archaeology, CHM = chemistry, ECO = ecology, EVO = evolution and development, GEO = geology, MIC = microbiology, OPH = ophthalmology, PAL = palaeontology). Credit: Professor Corey Bradshaw
  • (a) Ranks by gender and career stage (ECR = early career researcher, MCR = mid-career researcher, LCR = late-career researcher); (b) bootstrapped debiased (i.e., calculating the scaled residuals for each gender separately, and then ranking the combined dataset) ε′ ranks by gender and career stage. Credit: Professor Corey Bradshaw

The ε-index easily allows benchmarking of women-only or men-only subsets of researchers to adjust the threshold so that the ranks are more comparable between the two genders. Alternatively, dividing the genders, benchmarking them separately and then re-ranking the combined dataset effectively removes the gender bias in the ε-index, which is difficult or impossible to do with other ranking metrics.
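Continuing the hypothetical sketch above, the “benchmark each gender separately, then re-rank the pooled scores” step could look like the following; the `gender` field and the within-group scaling are assumptions for illustration, not the published implementation.

```python
# Illustrative gender debiasing: compute residuals within each gender separately,
# put them on a common scale, then re-rank the combined sample.
def debiased_ranking(records, current_year=2021):
    scores = {}
    for gender in ('F', 'M'):
        idx = [i for i, r in enumerate(records) if r.get('gender') == gender]
        if not idx:
            continue
        res, _ = illustrative_epsilon([records[i] for i in idx], current_year)
        res = (res - res.mean()) / res.std()        # common baseline across groups
        scores.update(zip(idx, res))
    return sorted(scores, key=lambda i: -scores[i])  # record indices, best first
```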

‘No ranking metric is perfect, but the ε-index is a big leap forward in overcoming bias and giving a clearer, fairer picture of research performance across the board,’ Professor Bradshaw says.



More information:
Corey J. A. Bradshaw et al, A fairer way to compare researchers at any career stage and in any discipline using open-access citation data, PLOS ONE (2021). DOI: 10.1371/journal.pone.0257141

Citation:
Banishing bias across disciplines, genders and experience—new tool for fairer research metrics (2021, September 10)
retrieved 11 September 2021
from https://phys.org/news/2021-09-banishing-bias-disciplines-genders-experiencenew.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.

Hexbyte Glen Cove Research on beards, wads of gum wins 2021 Ig Nobel prizes

In this Sept. 12, 2019 file photo, the 2019 Ig Nobel award is displayed at the 29th annual Ig Nobel awards ceremony at Harvard University in Cambridge, Mass. The spoof prizes for weird and sometimes head-scratching scientific achievement will be presented online in 2021 due to the coronavirus pandemic. Credit: AP Photo/Elise Amendola, File

Beards aren’t just cool and trendy—they might also be an evolutionary development to help protect a man’s delicate facial bones from a punch to the face.

That’s the conclusion of a trio of scientists from the University of Utah who are among the winners of this year’s Ig Nobel prizes, the Nobel Prize spoofs that honor—or maybe dishonor, depending on your point of view—strange scientific discoveries.

The winners of the 31st annual Ig Nobels being announced Thursday included researchers who figured out how to better control cockroaches on U.S. Navy submarines; animal scientists who looked at whether it’s safer to transport an airborne rhinoceros upside-down; and a team that figured out just how disgusting that discarded gum stuck to your shoe is.

For the second year in a row, the ceremony was a roughly 90-minute prerecorded digital event because of the worldwide coronavirus pandemic, said Marc Abrahams, editor of the Annals of Improbable Research magazine, the event’s primary sponsor.

While disappointing in many ways because half the fun of a live ceremony is the rowdy audience participation, the ceremony retained many in-person traditions. Those included real Nobel laureates announcing the prizes, and the world premiere of a mini opera called “A Bridge Between People,” about children who literally build tiny suspension bridges to join two angry adults.

No faces were punched for the beard study, which was published in Integrative Organismal Biology.

In this Sept. 12, 2019 file photo, Shigeru Watanabe, of Japan, receives the Ig Nobel award in chemistry for estimating the total saliva volume produced per day by a typical five-year-old, at the 29th annual Ig Nobel awards ceremony at Harvard University, in Cambridge, Mass. The spoof prizes for weird and sometimes head-scratching scientific achievement will be presented online in 2021 due to the coronavirus pandemic. Credit: AP Photo/Elise Amendola, File

Instead, University of Utah scientists Ethan Beseris, Steven Naleway and David Carrier used a fiber epoxy composite to simulate human bone, and sheepskin to act as the skin—sometimes with the fleece still on, sometimes sheared. They then dropped weights on them.

The sample with the fleece still attached absorbed more energy than the sheared samples.

“If the same is true for human facial hair, then having a full beard may help protect vulnerable regions of the facial skeleton from damaging strikes, such as the jaw,” they said. “Presumably, full beards also reduce injury, laceration, and contusion to the skin and muscle of the face.”

It’s obvious that those wads of discarded chewing gum found on sidewalks around the world are pretty revolting.

But just how revolting?

Researchers from a Spanish university determined the already-chewed gum that has been stuck to the sidewalk for three months is teeming with nasty bacteria.

In this Sept. 12, 2019 file photo, audience members toss paper airplanes at the 29th annual Ig Nobel awards ceremony at Harvard University, in Cambridge, Mass. The spoof prizes for weird and sometimes head-scratching scientific achievement will be presented online in 2021 due to the coronavirus pandemic. Credit: AP Photo/Elise Amendola, File

It sounds like a silly study, but as usual, there was some method to the madness.

“Our findings have implications for a wide range of disciplines, including forensics, contagious disease control, or bioremediation of wasted chewing gum residues,” Leila Satari, Alba Guillén, Àngela Vidal-Verdú, and Manuel Porcar from the University of Valencia wrote in their paper, which was published on Nature.com.

A team of U.S. Navy researchers won for figuring out a cheaper and more effective way to control cockroaches on submarines. The 1971 study, which appeared in the Journal of Economic Entomology, found that traditional methods such as carboxide fumigation and use of the pesticide malathion were not good enough.

They found that using the pesticide dichlorvos was less expensive and more effective.

The goal is to return next year’s ceremony to its traditional home at Harvard University’s Sanders Theatre, Abrahams said, but much depends on whether the pandemic is under control and what kind of travel restrictions are in place around the world.



© 2021 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.

Citation:
Research on beards, wads of gum wins 2021 Ig Nobel prizes (2021, September 9)
retrieved 9 September 2021
from https://phys.org/news/2021-09-beards-wads-gum-ig-nobel.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.

Hexbyte Glen Cove Biden administration targets 20% drop in aviation emissions

The Biden administration aims to reduce emissions from aviation fuel by 20 percent by shifting production away from conventional jet fuel.

US President Joe Biden’s administration on Thursday unveiled measures that aim to reduce aviation emissions by 20 percent by 2030 in its latest initiative to combat climate change.

Aviation accounts for 11 percent of US transportation-related emissions and addressing this pollution source is “essential if we hope to put the [aviation sector], and the economy, on track to achieve net-zero greenhouse gas emissions by 2050,” the White House said in a statement.

Like gasoline, traditional jet fuel is a petroleum-based energy source that is processed from crude oil at refineries.

The White House measures include a proposed tax credit on fuels that achieve at least a 50 percent drop in “lifecycle” emissions, according to the fact sheet.

The administration also highlighted some $4.3 billion in funding aimed at new fuel development as well as a sustainable fuel “grand challenge” to boost its use by 2030.

The aviation announcement comes on the heels of earlier Biden administration initiatives to address climate change, including ambitious targets for increasing solar as a US power source and boosting sales of electric cars.



© 2021 AFP

Citation:
Biden administration targets 20% drop in aviation emissions (2021, September 9)
retrieved 9 September 2021
from https://phys.org/news/2021-09-biden-administration-aviation-emissions.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.

Hexbyte Glen Cove The seismic zone worrying quake-hit Mexico

Workers remove rocks from a highway after an earthquake near Mexico’s Pacific resort of Acapulco.

A 7.1-magnitude earthquake that struck southern Mexico this week originated near a section of a fault that experts worry has been building up tension for more than a century.

The powerful tremor that occurred on Tuesday near the Pacific resort city of Acapulco left at least one person dead and damaged hospitals, homes, shops and hotels.

Mexico sits in the world’s most seismically and volcanically active zone, known as the Ring of Fire, where the Pacific plate meets surrounding tectonic plates.

On September 19, 1985 an 8.1-magnitude quake devastated Mexico City, killing more than 10,000 people and destroying hundreds of buildings.

On the anniversary of that earthquake in 2017, a 7.1 quake left around 370 people dead, mainly in the capital.

Where did latest quake occur?

The epicenter was located 11 kilometers (seven miles) southeast of Acapulco in Guerrero state, the National Seismological Service reported.

The region is vulnerable to earthquakes because the Cocos and North American tectonic plates grind against each other.

“The undersea Cocos plate is constantly trying to push below North America, the continental [plate] where most of Mexico’s territory sits,” said Arturo Iglesias, head of the National Seismological Service.

Map of Mexico showing five tectonic plates.

Within this zone lies what is known as the Guerrero gap—part of a fault stretching from Acapulco 230 kilometers northwest along the coast to the town of Papanoa.

The United States Geological Survey describes a seismic gap as a section of a fault that has produced earthquakes in the past but is now quiet.

Why is it causing concern?

Although earthquakes cannot be predicted, the Guerrero gap is a source of worry because it has not produced a significant earthquake since 1911.

“It’s an area off the coast of Guerrero that has not had a [major earthquake] for a long time,” said Ana Maria Soler, an expert at the National Autonomous University of Mexico.

“There is no certainty. These are all questions of statistics,” she added.

Iglesias noted that it has been 110 years since the Guerrero gap produced a major earthquake.

“They typically occur every 30 to 50 years. It’s already more than that,” he said.

The Pacific Ring of Fire.

What could happen?

Although experts are reluctant to make predictions, they believe that a large earthquake, or several smaller ones, could occur in the zone.

“The northwestern edge of the gap already broke in an earthquake in 2014, so what remains can generate an [earthquake] of up to 7.8 magnitude, but it may be more, smaller earthquakes,” Iglesias said.

Tremors in the area may also happen less frequently than originally thought, he said.

Mexico City, which together with surrounding urban areas is home to more than 20 million people, is seen as vulnerable to a quake in the Guerrero gap.

The capital sits atop a former lake bed and has an early warning alarm system using seismic monitors that aims to give residents enough time to evacuate buildings.

“It’s well known that this area (the Guerrero gap) has significant potential to generate earthquakes,” Iglesias said.

“So much so that we have an early warning system for Mexico City,” he added.



© 2021 AFP

Citation:
The seismic zone worrying quake-hit Mexico (2021, September 9)
retrieved 9 September 2021
from https://phys.org/news/2021-09-seismic-zone-quake-hit-mexico.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.

Hexbyte Glen Cove Limiting fossil fuel extraction to keep global warming below 1.5° C target

Credit: CC0 Public Domain

Nearly 60% of both oil and fossil methane gas and almost 90% of coal must remain in the ground by 2050 in order to keep global warming below 1.5° C, finds a study by UCL researchers.

Global oil and gas production must decline by 3% annually until 2050 in order to reach this target. Many fossil fuel extraction projects, both planned and operational, are not conducive to meeting internationally agreed target limits on global warming, as set out by the Paris Climate Agreement in 2015. A significant number of regions have therefore already reached peak fossil fuel production, and any increase in production from one region must be offset by a greater production decline elsewhere.
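A quick back-of-the-envelope check of what the stated 3% annual decline implies by 2050 (simple compounding, not a figure from the study itself):

```python
# If global oil and gas production falls 3% every year from 2020 to 2050,
# output in 2050 is (1 - 0.03)^30 of the 2020 level, i.e. roughly 40%.
decline_rate = 0.03
years = 2050 - 2020
remaining = (1 - decline_rate) ** years
print(f"2050 production ≈ {remaining:.0%} of 2020 output")   # ≈ 40%
```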

The findings, published in Nature, are based on a 50% probability of limiting warming to 1.5° C this century, meaning that increasing the likelihood of reaching this target would require an even more rapid decline in production and more fossil fuels left in the ground.

The researchers used a global energy system model to assess the amount of fossil fuels that would need to be left unextracted regionally and globally.

The required unextracted reserves are estimated as the percentage of the 2018 reserve base. This needs to be 58% for oil, 59% for fossil methane gas and 89% for coal by 2050.

Lead author Dan Welsby (UCL Institute for Sustainable Resources) says that “in 2015, 196 parties signed the Paris Climate Agreement, aiming to keep average global temperature rise to well-below 2° C, with 1.5° C the desired target. Since then, the IPCC Special Report on 1.5° C, successive Production Gap Reports and the IEA Net Zero Report have indicated beyond doubt that dramatic cuts in fossil fuel production are required immediately in order to move towards net zero emissions, and that current and indicated fossil fuel production trajectories are moving us in the wrong direction.

“Our new paper adds further weight to recent research, indicating that global oil and fossil methane gas production has already peaked. From a regional perspective, our results suggest significant transition risk for large fossil fuel producers. Oil production in the Middle East for example roughly halves between 2020 and 2050, suggesting the diversification of economies away from a dependence on hydrocarbon revenues is absolutely critical.”

The work builds on previous research in 2015, which found that in order to limit warming to 2° C, a third of oil reserves, nearly half of fossil methane gas (49%) reserves and over 80% of coal reserves should remain in the ground.

The researchers used the TIMES Integrated Assessment Model at UCL (TIAM-UCL). The model captures primary energy sources—oil, fossil methane gas, coal, nuclear, biomass and renewables—from production through to conversion (e.g. electricity, hydrogen and biofuel production or oil refining), and distribution to meet a set of demands in each end-use sector.

Countries of the world are represented as 16 regions, which allows for a detailed characterization of regional energy sectors. The model assessed different scenarios including lower demands in key carbon intensive sectors (aviation and chemicals) and uncertainty around the availability and deployment of key carbon capture, utilization and storage (CCUS) and negative emissions technologies (NETs).

In terms of the regional distribution of unextractable fossil fuel reserves, the researchers found that the Middle East must leave around 60% of oil and gas reserves in the ground, which, given the large size of its reserve base, also results in huge absolute volumes. Additionally, regions with high concentrations of relatively high-cost and carbon-intensive oil deposits within the reserve base show high proportions of unextractable reserves, including the oil sands in Canada (83%) and ultra-heavy oil in Central and South America (73%). The regional differences in the proportion of fossil fuels that must remain unextracted come down to a combination of factors, including extraction costs, the carbon intensity of production and the costs of alternative technologies to fossil fuels.

Mr. Welsby continued, “We stress that our estimates of unextractable reserves and production decline rates are likely underestimates, given we use a budget consistent with only a 50% chance of meeting 1.5° C and the huge uncertainty around the deployment of negative emission technologies. However, assuming the political will is present to fulfill the commitments made in Paris, the reductions in [production] suggested in our work are entirely feasible.”



More information:
Dan Welsby et al, Unextractable fossil fuels in a 1.5° C world, Nature (2021). DOI: 10.1038/s41586-021-03821-8

Hexbyte Glen Cove Cats less stressed after adoption by families with children with autism, study finds

Credit: Pixabay/CC0 Public Domain

While researchers have found that adding a shelter cat to the family can help lower stress and anxiety for children with autism, a new study at the University of Missouri shows that joining a family does wonders for the felines, too.

“It’s not only important to examine how families of children with autism may benefit from these wonderful companion animals, but also if the relationship is stressful or burdensome for the shelter cats being adopted into a new, perhaps unpredictable environment,” said Gretchen Carlisle, a research scientist at the MU Research Center for Human-Animal Interaction (ReCHAI) in the MU College of Veterinary Medicine. “In our study, we found the cats acclimated well to their new families and became significantly less stressed over time.”

The findings, published Monday in Frontiers in Veterinary Science, highlight the mutual benefits of human-animal interaction and build off previous MU research that found pets may help reduce stress and anxiety for both children with autism and their parents.

Carlisle and her team monitored shelter cats for 18 weeks after being adopted by Missouri families with at least one child with autism. The cats were first screened using the Feline Temperament Profile to identify shelter cats with a calm and laid-back temperament. After families selected a cat that had passed the screening, researchers made home visits to check on the cats 2-3 days after adoption and then every six weeks for 18 weeks, to see how they acclimated to their newly adopted families.

“Cortisol is a stress measure we tracked through collecting samples of the cats’ feces, and we noticed a significant decrease in [cortisol] over time,” Carlisle said. “Cats also tend to lose weight due to not eating if they are stressed, but we found the cats actually gained a bit of weight initially after adoption and then maintained their weight as time went on, so both findings indicated the cats acclimated well.”

Carlisle explained that children with autism may have sensitivity or sensory issues and occasional problem behaviors accompanied by loud, sudden outbursts. Because of those concerns, shelter cats that have been screened for a calm, easy-going temperament may increase the likelihood of a better long-term match for both the children and the cat.

“It’s crucial to look after the welfare of the cats from a humanitarian standpoint, and this research also helps animal shelter staff overcome the financial and management hurdles that can result when cats are returned to shelters if there is not a good fit with the adopted family,” Carlisle said. “Obviously, the shelters want to place all of their cats in homes, but some families may require a more specific fit, and using research-based, objective measurements for screening temperament may help increase the likelihood of successful, long-term matches. Our hope is that other scientists will build on the work of our exploratory study so shelter cats and families of children with autism might benefit.”

“Exploratory study of fecal cortisol, weight and behavior as measures of stress and welfare in shelter cats during assimilation into families of children with autism spectrum disorder” was published Monday in Frontiers in Veterinary Science.



Citation:
Cats less stressed after adoption by families with children with autism, study finds

Hexbyte Glen Cove Megaprojects and the ‘need for speed’: How political indecision affects timelines of large infrastructure investments

Credit: Pixabay/CC0 Public Domain

Building a new transit line or a highway almost always takes longer than initially planned. A new study suggests that the biggest contributor to such delays isn’t necessarily the design, planning or even construction phases. Instead, it’s the time required for local or provincial authorities to decide what exactly they want to build.

“Here in Toronto, there is a strong need for infrastructure, such as more housing and better [transit]—and really, we needed it years ago,” says University of Toronto Engineering Professor Shoshanna Saxe, one of the lead authors of a new study recently published in the European Journal of Transport and Infrastructure Research.

“With this big a deficit, there is a sense of urgency and a need to move fast. On the other hand, large infrastructure is expensive, permanent and causes a lot of disruption. You don’t want to end up building the wrong infrastructure by not taking the time to listen to the people who will be affected by it.”

Back in 2017, a series of joint workshops hosted by the University of Toronto and University College London (UCL) brought Saxe together with a new team of collaborators, including her fellow U of T Professor Matti Siemiatycki and Dr. Daniel Durrant at UCL’s Bartlett School of Planning.

Given their shared interest in infrastructure, they decided to crunch the numbers on the megaprojects from both Toronto and London, U.K., for which they could gather data.

The goal was to look at how long it takes to go from idea to opening for these megaprojects, as well as how that time is being spent and whether projects that spend years in deliberation actually benefit from that opportunity for sober second thought.

“We looked at whether the gestational period led to any change along two metrics: either a location change—for example, rail stations in a different location—or a [scope change] in the project,” says Saxe.

“If we can point to either of those, it suggests that the time spent considering and reconsidering the [project] at least resulted in some change, potentially for the better. But for a lot of projects we looked at, things didn’t change that much.”

The team analyzed 16 projects with a total cost of more than $500 million from Toronto, and a further 10 with a cost of more than £500 million from London. Of these, more than half did not change significantly by the time construction started.

This is despite the fact that, in many cases, the informal planning periods lasted much longer than the actual building phase.

“Consultation and consideration before we invest billions of dollars is important,” says Saxe. “But there is also an opportunity cost to not acting. There’s a lot of good tied up in these projects that we could have had much earlier, if we had moved faster.”

Saxe says that in her opinion, the consequences of this ‘analysis paralysis’ are evident in the current landscape of the Greater Toronto Area.

“Not building something is a choice,” she says. “Here in Toronto, we allowed our population to outgrow the carrying capacity of our public transportation network, to the point where a huge number of people now have no choice but to drive cars, with all of the cost, pollution and congestion that come with that.”

“We didn’t consciously design that system—rather it was a consequence of not choosing to design a different one. What I would like to see is an honest discussion about priorities, and a commitment to following through with the funding necessary to make them a reality.”



Citation:
Megaprojects and the ‘need for speed’: How political indecision affects timelines of large infrastructure investments (2021, September 8)
retrieved 8 September 2021
from https://phys.org/news/2021-09

Hexbyte Glen Cove Climate crisis could double frequency of extreme regional summer droughts in Europe

Figure: Drying stripes for two selected hot-spot regions showing the percent of normal index (PNI), which gives the percentage of precipitation of a given month and year compared to the long-term mean in a counterfactual world with pre-industrial greenhouse gas concentrations. The PNI values were calculated for each climate simulation of the ensemble, before taking the median of those 50 values. Credit: Magdalena Mittermeier

The ongoing climate crisis has already had drastic global impacts. For example, droughts have become critical high-impact hydro-meteorological hazards worldwide. In Europe, the consequences of the climate crisis have been severe, with droughts having caused considerable social, economic, and environmental costs already, especially in the years 2003, 2010, and 2018. 

Climate projections show that more frequent and more intense droughts are expected by the end of the 21st century. Researching the future occurrence of droughts is crucial for adequate crisis mitigation. A new study published in Frontiers in Water shows that Europe is headed towards a future of increased severe to extreme droughts.

“Summer droughts are a highly relevant topic in Europe,” said author Magdalena Mittermeier, who shares the first authorship with Andrea Böhnisch, both from the Ludwig-Maximilians-Universität München (LMU) in Germany. “We find a clear trend towards more, longer and more intense summer droughts, in terms of a precipitation deficit, towards the end of the century under a high-emission carbon scenario (RCP8.5).”

According to the World Health Organization (WHO), droughts are the most serious hazard to crops and livestock in every part of the world, with an estimated 55 million people globally affected by them every year.

The impacts of droughts are economically, socially, and environmentally complex, and a universal definition that covers all consequences does not exist. Instead, droughts are classified by their impact, as meteorological, hydrological, agricultural, or socio-economic. Meteorological droughts are a potential predecessor of other types and are therefore important to research. 

Key findings

  • In the long-term future (from 2080 to 2099), Europe will see an increase in the frequency and intensity of summer droughts and a decreasing number of winter droughts in several regions of various climates.
  • In the present day and far future, there is a high variability of drought intensities across various European climate regions.
  • There will be greater differences between winter and summer precipitation: it will increase during winter and decrease during summer.
  • For mid-Europe, the annual occurrence probability of an extreme drought strongly increases during the summer months, amounting to 25%. In Eastern Europe and the Alps, severe and extreme droughts have higher probabilities in the far future with values around 20% (severe) and 40% (extreme).
  • For France, the models predict an increase in the frequency of extreme droughts of up to 60%.
  • In the Mediterranean, the percentages of extreme droughts in the far future reach around 80% for the summer months. In the Iberian Peninsula the percentage of extreme droughts is the highest of all regions, reaching 96% in July and 88% in August. In these two regions, however, the absolute precipitation values in July and August are already low in the reference period (each contributing only around 2-3% to the annual precipitation). This means that comparatively small absolute changes can lead to high percentages of the PNI, which is a relative measure.
  • Four future drought hot spots were identified: France, the Alps, the Mediterranean, and the Iberian Peninsula. These could see an increase of more than 50% in the frequency of extreme summer droughts.

Researching drought occurrence

Regional differences between drought events are high, and there is an urgent need to identify geographical hot spots for future drought events. Böhnisch and her colleagues at the Ludwig-Maximilians University and the Ouranos Consortium in Canada assessed current and future climate trends and drought hot spots for Europe. 

The authors divided Europe into eight regions with different climates: the British Isles, Scandinavia, mid-Europe, the Alps, Eastern Europe, France, the Mediterranean and the Iberian Peninsula. The researchers then analyzed the ‘percent of normal index’ (PNI, which gives the percentage of precipitation in a given period compared to the normal precipitation in the reference period) in a single climate model over the eight regions. A long-term future (from 2080 to 2099) under the Representative Concentration Pathway 8.5 was compared to the present day (2001 to 2020).
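As a minimal illustration of the PNI as defined above (the numbers are made up, not taken from the study):

```python
# Percent of normal index: precipitation in a given period expressed as a
# percentage of the long-term "normal" precipitation in the reference period.
def percent_of_normal(precip_mm, reference_mean_mm):
    return 100.0 * precip_mm / reference_mean_mm

# Example: 18 mm of rain in a July whose reference-period mean is 60 mm
print(percent_of_normal(18.0, 60.0))   # 30.0 -> only 30% of normal precipitation
```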

Four future hot spots

The results show an overall increase in drought numbers, with high variability of drought intensities between regions in the present day period and the projected far future. In the long-term future, summer droughts are projected to become more extreme and winter droughts will become less frequent in several regions.

Four hot spots with strong drought trends were identified: France, the Alps, the Mediterranean and the Iberian Peninsula.

“Our study shows that unabated climate change will worsen the risk of hot-spot droughts drastically. But also, in some regions where droughts currently play a minor role, the future drought risk is expected to get serious. We show that the Alps should be considered an additional future hot-spot,” said Mittermeier.

“Unmitigated climate change, under the RCP8.5 scenario, will drastically increase the frequency, duration and intensity of summer droughts in many European regions. Such extreme effects can be avoided by climate mitigation. This is why consistent mitigation of climate change as agreed on under the Paris Agreement is highly relevant in terms of droughts in Europe.” 

She continued: “Three key features are visualized in what we call ‘drying stripes’: first, increasing drought occurrence in summer; second, wetter conditions in winter; and third, interannual variations due to the natural variability of the climate system.”

“These allow an overview of our results at first glance. The drying stripes show the percentage of precipitation for every month and year summarized over our ensemble compared to the long-term mean in a counterfactual world with pre-industrial greenhouse gas concentrations. With this, they show the projected drying trend throughout the 21st century compared to a world without climate change.”



More information:
Andrea Böhnisch et al, Hot Spots and Climate Trends of Meteorological Droughts in Europe–Assessing the Percent of Normal Index in a Single-Model Initial-Condition Large Ensemble, Frontiers in Water (2021). DOI: 10.3389/frwa.2021.716621

Citation:
Climate crisis could double frequency of extreme regional summer droughts in Europe (2021, September 7)
retrieved 7 September 2021
from https://phys.org/news/2021-09-clim

Hexbyte Glen Cove New filtering method promises safer drinking water, improved industrial production

Credit: Pixabay/CC0 Public Domain

A team of scientists at the Tufts University School of Engineering has developed a new filtering technology. Inspired by biology, it could help curb a drinking water-related disease that affects tens of millions of people worldwide and potentially improve environmental remediation, industrial and chemical production, and mining, among other processes.

Reporting in the Proceedings of the National Academy of Sciences, the researchers demonstrated that their novel polymer membranes can separate fluoride from chloride and other ions—electrically charged atoms—with twice the selectivity reported by other methods. They say application of the technology could prevent fluoride toxicity in regions where the element occurs naturally at levels too high for human consumption.

It is well known that adding fluoride to a water supply can reduce the incidence of tooth decay, including cavities. Less well known is the fact that some groundwater supplies have such high natural levels of fluoride that they can lead to severe health problems. Prolonged exposure to excess fluoride can cause fluorosis, a condition that can actually weaken the teeth, calcify tendons and ligaments, and lead to bone deformities. The World Health Organization estimates that excessive fluoride concentrations in drinking-water have caused tens of millions of dental and skeletal fluorosis cases worldwide.

The ability to remove fluoride with a relatively inexpensive filtering technology could protect communities from fluorosis without requiring the use of high-pressure filtration or having to completely remove all components and then re-mineralize the drinking water.

“The potential for ion selective membranes to reduce excess fluoride in drinking water supplies is very encouraging,” said Ayse Asatekin, associate professor of chemical and biological engineering in the School of Engineering. “But the technology’s potential usefulness extends beyond drinking water to other challenges. The method we used to manufacture the membranes is easy to scale up for industrial applications. And because the implementation as a filter can also be relatively simple, low cost and environmentally sustainable, it could have wide applications to improving agricultural water supplies, cleaning up chemical waste, and improving chemical production.”

For example, theoretically the process could improve yields from limited geological reserves of lithium for sustainable lithium battery production or uranium needed for nuclear power generation, said Asatekin.

In developing the design of the synthetic membranes, Asatekin’s team was inspired by biology. Cell membranes are remarkably selective in allowing the passage of ions into and out of the cell, and they can even regulate the internal and external concentrations of ions and molecules with great precision.

Biological ion channels create a more selective environment for the passage of these small ions by lining the channels with functional chemical groups that have different sizes and charges and different affinity for water. The interaction between the passing ions and these groups are forced by the nanometer dimensions of the channel pores, and the rate of passage is affected by the strength or weakness of the interactions.

The filtration membranes created by Asatekin’s team were designed by coating a zwitterionic polymer—a polymer in which molecular groups contain closely linked positive and negative charges on their surface—onto a porous support, creating membranes with channels narrower than a nanometer surrounded by both water-repelling and positively and negatively charged chemical groups. As with the biological channels, the very small size of the pores forces the ions to interact with the charged and repelling groups in the pores, allowing some ions to pass much faster than others. In the current study, the composition of the polymer was made to target the selection of fluoride vs chloride. By altering the composition of the zwitterionic polymer, it should be possible to target the selection of different ions, the researchers say.

Most current filtering membranes separate molecules by significant differences in particle or molecular size and charge, but have difficulty distinguishing single-atom ions from each other because of their small size, especially when their electric charges are nearly identical.

By contrast, the Tufts researchers’ membranes are capable of separating ions that differ by only a fraction of their atomic diameter even when their electric charges are nearly identical.

Zwitterco, a Cambridge-based company which helped fund this work, will be exploring the scale-up of manufacturing the ion-separating membranes to test their application in industrial settings.



More information:
Interaction-based ion selectivity exhibited by self-assembled, cross-linked zwitterionic copolymer membranes, Proceedings of the National Academy of Sciences (2021). doi.org/10.1073/pnas.2022198118

Hexbyte Glen Cove Fact-checking works across the globe to correct misinformation

Credit: Unsplash/CC0 Public Domain

Fact-checking works to reduce false beliefs across the globe, according to a new study conducted in four countries.

Researchers found that fact-checking worked with little variation in Argentina, Nigeria, South Africa and the U.K., and the positive effects were still detectable two weeks later.

Even more encouraging, there was no evidence of a “backfire” effect of fact-checking, said Thomas Wood, co-author of the study and assistant professor of political science at The Ohio State University.

“When we started doing misinformation work about five years ago, it was the consensus that correcting misinformation wasn’t just ineffective, but that it was aggravating the problem and making people more entrenched in their false beliefs,” Wood said.

“We found no evidence of that in these four countries. What we did find was that fact-checking can be a very effective tool against misinformation.”

Wood conducted the study with Ethan Porter, assistant professor of media and public affairs at George Washington University. The research was published Sept. 6, 2021 in the Proceedings of the National Academy of Sciences.

The researchers worked with fact-checking organizations in the four countries that are part of the International Fact-Checking Network, an organization that promotes nonpartisan and transparent fact-checking. They evaluated five fact-checks that were unique to each country and two—regarding COVID-19 and climate change—that were tested in all four countries.

The fact-checks in each country, done in September and October 2020, covered a broad range of misinformation, including local politics, crime and the economy.

Some of the 2,000 participants in each country received only the misinformation, while others received the misinformation followed by the actual corrections used by local fact-checking organizations in response to misinformation.

They then rated how much they believed the false statement on a scale of 1-5.

In each country, members of a control group did not receive any misinformation or corrective statements, but simply rated how much they believed the statements.

When compared to misinformation, every fact-check produced more accurate beliefs, while misinformation didn’t always lead to less accurate beliefs when compared to the controls.

Results showed that fact-checks increased factual accuracy by 0.59 points on the five-point scale. Misinformation decreased factual accuracy by less than 0.07 on the same scale.
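As a minimal sketch of how such effects are computed from the three conditions described above (control, misinformation only, misinformation plus fact-check), using made-up ratings rather than the study’s data:

```python
# Differences in mean belief/accuracy ratings (1-5 scale) between conditions.
# All numbers below are invented for illustration only.
from statistics import mean

control        = [3.1, 3.4, 3.0, 3.2]   # no misinformation, no correction
misinfo_only   = [3.0, 3.3, 3.1, 3.1]   # saw the misinformation only
with_factcheck = [3.7, 3.9, 3.6, 3.8]   # saw misinformation plus the fact-check

print(round(mean(with_factcheck) - mean(misinfo_only), 2))  # fact-check effect
print(round(mean(misinfo_only) - mean(control), 2))         # misinformation effect
```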

“Misinformation is far less persuasive than corrective information, by and large,” Wood said.

In three of the countries (South Africa, Argentina and the U.K.), the researchers returned two weeks later and asked participants how much they believed the false statements they evaluated earlier. Results showed that the positive effects of fact-checking were still robust two weeks later.

Two topics were tested in all four countries. One involved climate change, testing how much people believed the false statement, commonly shared at the time, that there were two years of record-breaking global cooling between 2016 and 2018. Another tested the false statement, which was widely shared near the beginning of the COVID-19 pandemic, that gargling saltwater would prevent infection with the coronavirus.

Results showed that exposure to the climate change misinformation did not uniformly lead to people being less accurate on that issue.

But the misinformation regarding COVID-19 did lower accuracy in three of the four countries and had the largest misinformation effects found in the study. However, the fact-checks did help boost accuracy on this issue.

All participants also completed measurements of their political beliefs, to see if that influenced how they were swayed by fact-checks.

Results showed that participants’ reactions to fact-checks were connected to their beliefs—but in no case did an ideological group become more inaccurate because they were exposed to a correction.

“Some corrections didn’t improve the accuracy for some ideological groups, but they didn’t provoke any instances of backfire,” Wood said.

“Overall, the beliefs of adherents of the left, center and right alike were made more accurate by fact-checks, even when topics were politically charged.”

Wood noted that studies suggesting that fact-checking can backfire were mostly done in the United States and countries like it—what researchers call WEIRD samples, for Western, Educated, Industrialized, Rich and Democratic.

People in countries with these WEIRD populations, where political ideology plays a strong role in beliefs, may behave somewhat differently from those in other parts of the world. In fact, in this new study, the corrective effects of fact-checking were smallest in the U.K., which is the most WEIRD country of the four, Wood said.

“People in less ideological countries are going to be more factually adherent,” he said.

But still, Wood emphasized that this study showed that fact-checking was valuable in four countries that were diverse among racial, economic and political lines.

“Fact-checking is a powerful tool,” he said.

These results will be especially important as the COVID-19 vaccines become more accessible in places like Africa, he said. While dangerous misinformation about vaccine safety and efficacy may also follow there, this study suggests a rigorous program of fact-checking on social media and elsewhere could be effective at battling falsehoods.

“These populations may be even more receptive to corrections of misinformation than those from richer, more industrialized countries where ideology is more important,” Wood said.



More information:
The global effectiveness of fact-checking: Evidence from simultaneous experiments in Argentina, Nigeria, South Africa, and the United Kingdom, Proceedings of the National Academy of Sciences (2021).
