Study finds plants store carbon for shorter periods than thought

Image: the Amazon rainforest. Credit: CC0 Public Domain

The carbon stored globally by plants is shorter-lived and more vulnerable to climate change than previously thought, according to a new study.

The findings have implications for our understanding of the role of nature in mitigating climate change, including the potential for nature-based carbon removal projects such as mass tree-planting.

The research, carried out by an international team led by Dr. Heather Graven at Imperial College London and published in Science, reveals that existing climate models underestimate the amount of carbon dioxide (CO2) that is taken up by vegetation globally each year, while overestimating how long that carbon remains there.

Dr. Graven, Reader in Climate Physics in Imperial’s Department of Physics, said, “Plants across the world are actually more productive than we thought they were.”

The findings also mean that while plants take up carbon more quickly than previously thought, they also lock it up for a shorter time, so carbon from human activities will be released back into the atmosphere sooner than previously predicted.

Dr. Graven added, “Many of the strategies being developed by governments and corporations to address climate change rely on plants and forests to draw down planet-warming CO2 and lock it away in the ecosystem.

“But our study suggests that carbon stored in living plants does not stay there as long as we thought. It emphasizes that the potential for such nature-based carbon removal projects is limited, and fossil fuel emissions need to be ramped down quickly to minimize the impact of climate change.”






Video abstract. Credit: Heather Graven / Imperial College London

Using carbon

Until now, the rate at which plants use CO2 to produce new tissues and other parts globally—a measure known as Net Primary Productivity—has been approximated by scaling up data from individual sites. But the sparsity of sites with comprehensive measurements means it has not been possible to accurately calculate Net Primary Productivity globally.

Plants’ productivity has been increasing since the early 1900s, and plants currently take up more CO2 than is released back to the air. Researchers know that approximately 30% of the CO2 emitted by human activities is therefore stored in plants and soils each year, reducing climate change and its impacts.

However, the details of how this storage happens, and its stability into the future, are not yet well understood.

In this study, radiocarbon (14C)—a radioactive isotope of carbon—was combined with model simulations to understand how plants use CO2 at a global scale, unlocking valuable insights into the interaction between the atmosphere and the biosphere.

Tracking carbon from bomb tests

Radiocarbon is produced naturally, but nuclear bomb testing in the 1950s and 1960s increased the level of 14C in the atmosphere. This extra 14C was available to plants globally, giving researchers a useful tracer for measuring how quickly plants take up carbon.

By examining the accumulation of 14C in plants between 1963 and 1967—a period when there were no significant nuclear detonations and the total 14C in the Earth system was relatively constant—the authors could assess how quickly carbon moves from the atmosphere to vegetation and what happens to it once it’s there.
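
As a rough illustration of the underlying logic (a minimal one-box sketch, not the study's model; the vegetation stock and the atmospheric bomb-14C curve below are illustrative assumptions), at steady state the turnover time of vegetation carbon is the stock divided by NPP, so a higher NPP pulls the bomb spike into plants more quickly while also implying a shorter storage time:

```python
import numpy as np

C_STOCK = 450.0                      # assumed global vegetation carbon stock (PgC)
YEARS = np.arange(1955, 1968)

# Idealized atmospheric bomb-14C excess (per mil above the pre-bomb level);
# the shape is illustrative, not the measured Delta-14C record.
atm_d14c = np.interp(YEARS, [1955, 1964, 1967], [0.0, 800.0, 650.0])

def vegetation_d14c(npp):
    """Vegetation 14C excess over time for a given NPP (PgC/yr)."""
    tau = C_STOCK / npp              # mean carbon turnover time (yr)
    veg = 0.0
    trajectory = []
    for a in atm_d14c:
        # Each year a fraction 1/tau of the stock is replaced by new
        # carbon carrying the current atmospheric signature.
        veg += (a - veg) / tau
        trajectory.append(veg)
    return np.array(trajectory)

for npp in (45.0, 80.0):             # lower vs. higher global NPP scenarios
    excess = vegetation_d14c(npp)
    print(f"NPP = {npp:.0f} PgC/yr -> turnover {C_STOCK / npp:4.1f} yr, "
          f"vegetation 14C excess in 1967: {excess[-1]:.0f} per mil")
```

The higher-NPP scenario accumulates bomb 14C noticeably faster, which is the kind of signal the authors used to constrain both productivity and turnover.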

The results show that current, widely-used models that simulate how land and vegetation interact with the atmosphere underestimate the net primary productivity of plants globally. The results also show that the models overestimate the storage time of carbon in plants.

Role of the biosphere

Co-author Dr. Charles Koven, from Lawrence Berkeley National Laboratory, U.S., said, “These observations are from a unique moment in history, just after the peak of atomic weapons testing in the atmosphere in the 1960s.

“The observations show that the growth of plants at the time was faster than current climate models estimate that it was. The significance is that it implies that carbon cycles more rapidly between the atmosphere and biosphere than we have thought, and that we need to better understand and account for this more rapid cycling in climate models.”

The authors say the research demonstrates the need to improve theories about how plants grow and interact with their ecosystems, and to adjust global climate models accordingly, to better understand how the biosphere is mitigating climate change.

Co-author Dr. Will Wieder, from the National Center for Atmospheric Research, U.S., said, “Scientists and policymakers need improved estimates of historical land carbon uptake to inform projections of this critical ecosystem service in future decades. Our study provides critical insights into terrestrial carbon cycle dynamics, which can inform models that are used for climate change projections.”

The work highlights the usefulness of radiocarbon measurements in helping to unpick the complexities of the biosphere. The study’s authors include German physicist Ingeborg Levin, a pioneer in radiocarbon and atmospheric research, who sadly died in February.

More information:
Heather D. Graven, Bomb radiocarbon evidence for strong global carbon uptake and turnover in terrestrial vegetation, Science (2024). DOI: 10.1126/science.adl4443. www.science.org/doi/10.1126/science.adl4443

Citation:
Study finds plants store carbon for shorter periods than thought (2024, June 20)
retrieved 29 June 2024
from https://phys.org/news/2024-06-carbon-shorter-periods-thought.html


A new approach to using neural networks for low-power digital pre-distortion in mmWave systems



Image: Overview of the proposed architecture for digital pre-distortion (DPD), connected between the baseband layer (BB) and the analog front-end (AFE). The top part operates continuously during transmission, whereas the bottom part is involved only in updating the calibration coefficients. Credit: Ludovico Minati / Tokyo Tech

In a study published in the journal IEICE Electronics Express, researchers present a neural network-based digital pre-distortion (DPD) technique for millimeter-wave (mmWave) radio-frequency power amplifiers (RF-PAs).

In the world around us, a quiet but very important evolution has been taking place in engineering over the last decades. As technology evolves, it becomes increasingly clear that building devices that are physically as close as possible to being perfect is not always the right approach. That’s because it often leads to designs that are very expensive, complex to build, and power-hungry.

Engineers, especially electronic engineers, have become skilled in using highly imperfect devices in ways that allow them to behave close enough to the ideal case to be successfully applicable. Historically, a well-known example is that of disk drives, where advances in control systems have made it possible to achieve incredible densities while using electromechanical hardware littered with imperfections, such as nonlinearities and instabilities of various kinds.

A similar problem has been emerging for radio communication systems. As the carrier frequencies keep increasing and channel packing becomes more and more dense, the requirements in terms of linearity for the radio-frequency power amplifiers (RF-PAs) used in telecommunication systems have become more stringent. Traditionally, the best linearity is provided by designs known as “Class A,” which sacrifice great amounts of power to maintain operation in a region where transistors respond in the most linear possible way.

On the other hand, highly energy-efficient designs are affected by nonlinearities that render them unstable without suitable correction. The situation has been getting worse because the modulation schemes used by the latest cellular systems have a very high power ratio between their lowest- and highest-intensity symbols. Specific RF-PA types such as Doherty amplifiers are highly suitable and power-efficient, but their native non-linearity is unacceptable without correction.

Over the last two decades, high-speed digital signal processing has become widely available, economical, and power-efficient, leading to the emergence of algorithms allowing the real-time correction of amplifier non-linearities through intentionally “distorting” the signal in a way that compensates the amplifier’s physical response.

These algorithms have become collectively known as digital pre-distortion (DPD), and represent an evolution of earlier implementations of the same approach in the analog domain. Throughout the years, many types of DPD algorithms have been proposed, typically involving real-time feedback from the amplifier through a so-called “observation signal,” and fairly intense calculations.
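
As a toy illustration of the basic idea (a minimal sketch assuming a memoryless third-order amplifier model, not any particular published algorithm), pre-distortion applies an approximate inverse of the amplifier's nonlinearity so that the cascade of pre-distorter and amplifier is close to linear:

```python
import numpy as np

A1, A3 = 1.0, -0.15                        # assumed amplifier coefficients

def pa_model(x):
    """Memoryless PA: linear gain plus a compressive third-order term."""
    return A1 * x + A3 * x**3

def predistort(x):
    """First-order inverse: u = x - (A3/A1) * x**3, valid for small |x|."""
    return x - (A3 / A1) * x**3

t = np.linspace(0.0, 1.0, 1000)
signal = 0.5 * np.sin(2 * np.pi * 5 * t)

direct = pa_model(signal)                  # amplifier alone
linearized = pa_model(predistort(signal))  # pre-distorter + amplifier

# The pre-distorted chain tracks the ideal unit-gain response more closely.
print("max |error| without DPD:", np.max(np.abs(direct - signal)))
print("max |error| with DPD   :", np.max(np.abs(linearized - signal)))
```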

While this approach has been instrumental to the development of third- and fourth-generation cellular networks (3G, 4G), it falls short of the emerging requirements of fifth-generation (5G) networks for two reasons. First, dense antenna arrays are subject to significant disturbances between adjacent elements, known as cross-talk, making it difficult to obtain clean observation signals and causing instability.

The situation is made considerably worse by the use of ever-increasing frequencies. Second, dense arrays of antennas require very low-power solutions, and this is not compatible with the idea of complex processing taking place for each individual element.

“We came up with a solution to this problem starting from two well-established mathematical facts. First, when a non-linearity is applied to a sinusoidal signal, it distorts it, leading to the appearance of new frequencies. Their intensity provides a sort of signature that, if the non-linearity is a polynomial, is almost uniquely associated with a set of coefficients. Second, multi-layer neural networks of the early kinds, introduced decades ago, are universal function approximators and are therefore capable of learning such an association and inverting it,” explains Prof. Ludovico Minati, leading inventor of the patent on which the study is based and formerly a specially-appointed associate professor at Tokyo Tech.
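
A toy sketch of that inversion idea might look as follows (an illustration under simplifying assumptions, not the patented design): the harmonic magnitudes produced by a polynomial nonlinearity acting on a test tone form a signature, and a small multi-layer network is trained to map the signature back to the polynomial coefficients. The coefficient ranges are deliberately restricted to one sign regime, since, as noted above, the association is only almost unique.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
N = 1024
t = np.arange(N)
tone = np.sin(2 * np.pi * 8 * t / N)      # test tone: 8 cycles per window

def harmonic_signature(c):
    """Apply y = c1*x + c2*x^2 + c3*x^3, return magnitudes of harmonics 1-3."""
    y = c[0] * tone + c[1] * tone**2 + c[2] * tone**3
    spec = np.abs(np.fft.rfft(y)) / N
    return spec[[8, 16, 24]]              # FFT bins of the 1st-3rd harmonics

# Training set: random coefficients, restricted to one sign regime so that
# the magnitude signature determines the coefficients unambiguously.
coeffs = np.column_stack([rng.uniform(0.5, 1.5, 5000),    # c1 > 0
                          rng.uniform(0.0, 0.5, 5000),    # c2 >= 0
                          rng.uniform(-0.5, 0.0, 5000)])  # c3 <= 0 (compressive)
sigs = np.array([harmonic_signature(c) for c in coeffs])

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=1000, random_state=0)
net.fit(sigs, coeffs)

true = np.array([0.9, 0.2, -0.25])
est = net.predict(harmonic_signature(true).reshape(1, -1))[0]
print("true coefficients:     ", true)
print("estimated coefficients:", est.round(3))
```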

The most recent types of RF-PAs based on CMOS technology, even when they are heavily nonlinear, tend to have a relatively simple response, free from memory effects.

“This implies that the DPD problem can be reduced to finding the coefficients of a suitable polynomial, in a way that is quick and stable enough for real-world operation,” explains Dr. Aravind Tharayil Narayanan, lead author of the study. Through a dedicated hardware architecture, the engineers at the Nano Sensing Unit of Tokyo Tech were able to implement a system that automatically determines the polynomial coefficients for DPD, based on a limited amount of data that could be acquired within the course of a few milliseconds.

Performing calibration in the “foreground,” that is, one path at a time, reduces issues related to cross-talk and greatly simplifies the design. While no observation signal is needed, the calibration can adjust itself to varying conditions through additional inputs, such as die temperature, power supply voltage, and the settings of the phase shifters and couplers connecting the antenna. While standards compliance may pose some limitations, the approach is in principle widely applicable.

“Because there is very limited processing happening in real-time, the hardware complexity is truly reduced to a minimum, and the power efficiency is maximized. Our results prove that this approach could in principle be sufficiently effective to support the most recent emerging standards. Another very convenient feature is that a considerable amount of hardware can be shared between elements, which is particularly convenient in dense array designs,” says Prof. Hiroyuki Ito, head of the Nano Sensing Unit of Tokyo Tech, where the technology was developed.

As part of an industry-academia collaboration effort, the authors were able to test the concept on realistic, leading-edge hardware operating at 28 GHz provided by Fujitsu Limited, working in close collaboration with a team of engineers in the Product Planning Division of the Mobile System Business Unit. Future work will include large-scale implementation using dedicated ASIC designs, detailed standards-compliance analysis, and realistic benchmarking in the field under a variety of settings.

An international PCT application for the methodology and design has been filed.

More information:
Aravind Tharayil Narayanan et al, A Neural Network-Based DPD Coefficient Determination for PA Linearization in 5G and Beyond-5G mmWave Systems, IEICE Electronics Express (2024). DOI: 10.1587/elex.21.20240186

Citation:
A new approach to using neural networks for low-power digital pre-distortion in mmWave systems (2024, May 10)
retrieved 29 June 2024
from https://techxplore.com/news/2024-05-approach-neural-networks-power-digital.html


Others’ words, not firsthand experience, shape scientific and religious belief formation, study finds



Image: Three proposals explaining the differential confidence. (A) Intuitively, one might assume that confidence in scientific entities (e.g., germs) is higher than confidence in religious entities (e.g., angels) because scientific entities are observable in principle whereas religious entities are not. (B) The Dual-Pathway Model proposes separate information pathways for scientific versus religious beliefs: belief in scientific entities is primarily driven by direct experience of causal outcomes, whereas belief in religious entities is primarily driven by testimony. (C) Contrary to these two models, the Unified Model proposes that belief in both scientific and religious entities is primarily driven by testimony, a form of cultural input. Credit: HKUST

An international research team led by the Hong Kong University of Science and Technology (HKUST) has found that people’s beliefs in science and religion are primarily shaped by the words of others rather than by their personal experiences. The study could help improve public understanding of how people form beliefs about important scientific issues, such as climate change and vaccination.

The team’s findings, produced in collaboration with researchers at Harvard University, Union College, and the Massachusetts Institute of Technology, were published in Trends in Cognitive Sciences.

People are generally more confident about the existence of scientific phenomena, like oxygen, than religious phenomena, like God; the conventional explanation is that people can experience oxygen for themselves, while it is harder to observe religious entities on one’s own.

The team, led by Prof. Mary Shaocong MA, a Research Assistant Professor from the Division of Social Science at HKUST, challenged the conventional view. They argued that both scientific and religious beliefs are primarily shaped by testimony or information we get from others, such as experts or our community, rather than personal experience.

The study highlights the decisive role of testimony in forming our beliefs and understanding of the world, contrary to the notion that direct experience is the main driver of scientific belief.

“Even when it seems like we’re experiencing something directly, our understanding is often heavily influenced by what we’ve been told by experts or our community. For example, witnessing a relative falling ill, it’s very hard for a child to detect that viruses cause illness; rather, they turn to others’ testimony, such as parents’ teaching, to understand the causal relations,” says Prof. Ma.

“Recognizing this can help determine the most effective way to communicate scientific information to the public. By highlighting the credibility and consensus of scientific evidence, it is possible to promote greater acceptance and confidence in scientific facts, especially regarding emerging scientific topics, such as climate change.

“This insight is crucial for combating misinformation and enhancing public understanding and support for scientific matters, such as addressing climate change and getting vaccinated,” she further explains.

In the research, the team reviewed empirical evidence from the past few decades and proposed a theoretical model that explains how people come to believe in the existence of invisible entities, such as germs in science or God in religion.

For example, the model holds that people believe in germs because doctors and scientists tell them germs exist, even though germs cannot be seen with the naked eye. Likewise, people infer that others get sick because of germs by learning this causal relation from testimony rather than by discovering the connection through personal observation.

The model also establishes that the more credible the source of the information and the more people who agree with it, the more likely people are to believe it. “If many people around us agree that climate change is real, their consensus strengthens our belief in these concepts,” she says.

It shows that people’s confidence in these phenomena is not because they have seen them directly but because they trust the sources that tell them about them.

Unlike previous models that proposed separate pathways for belief formation in science and religion, this model provides a unified explanation. It argues that others’ testimony, rather than direct experience, predominantly shapes beliefs in both domains.

More information:
Shaocong Ma et al, Scientific and religious beliefs are primarily shaped by testimony, Trends in Cognitive Sciences (2024). DOI: 10.1016/j.tics.2024.04.014

Citation:
Others’ words, not firsthand experience, shape scientific and religious belief formation, study finds (2024, June 4)
retrieved 29 June 2024
from https://phys.org/news/2024-06-words-firsthand-scientific-religious-belief.html


Exploit steals passwords by tapping into keystrokes



Image credit: Unsplash/CC0 Public Domain

Add one more threat to the list of risks you take when you use your phone to conduct business at the local coffee shop.

Researchers from universities in China and Singapore uncovered a security gap that permits snoops to lift your password by identifying your keystrokes.

Researchers are calling Wiki-Eve “the first WiFi-based hack-free keystroke eavesdropping system.”

The cyberattack demonstrated by the researchers is made possible by a feature of wireless communications called BFI, or beamforming feedback information. BFI permits devices to transmit feedback about their location more accurately, so that signals can be sent specifically toward the routers that are to receive them instead of being dispersed omnidirectionally.

But one vulnerability of BFI, a component of the 802.11ac WiFi standard (also known as WiFi 5), is that it transmits data in cleartext. That means there is no need for physical hacking or cracking of an encryption key.

The researchers devised a means of identifying a user’s device and capturing the cleartext transmissions.

Unlike older side-channel attacks, Wiki-Eve does not require planting rogue programs that trick a user into logging on to an illegitimate site. It also does not require setting up additional links to sense a target user’s keystrokes.

“Since BFI is transmitted from a smartphone to an AP [access point] in cleartext,” the researchers said, “it can be overheard by any other Wi-Fi devices switching to monitor mode.”

Researchers said Wiki-Eve “achieves 88.9% inference accuracy for individual keystrokes and up to 65.8% top-10 accuracy for stealing passwords of mobile applications.”

Keystroke inference is the determination of which key is being pressed based on BFI data. As a user’s finger moves over the keys on a keypad, the resulting variations in the wireless signal between device and base station can be tracked and identified with the aid of a deep-learning model.
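
As a schematic illustration of keystroke inference (synthetic data and a generic classifier; the actual attack trains a deep model on real BFI captures), each keystroke can be treated as perturbing a signal trace in a key-specific way, and a classifier learns the mapping from trace to key:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
N_KEYS, TRACE_LEN, PER_KEY = 10, 64, 200     # 10-digit keypad

# Assume each key has a characteristic (here random) trace template; real
# BFI traces would reflect the finger's position over the on-screen keypad.
templates = rng.normal(size=(N_KEYS, TRACE_LEN))

X = np.vstack([templates[k] + 0.5 * rng.normal(size=(PER_KEY, TRACE_LEN))
               for k in range(N_KEYS)])
y = np.repeat(np.arange(N_KEYS), PER_KEY)

# Hold out 20% of the traces for evaluation.
idx = rng.permutation(len(y))
split = int(0.8 * len(y))
train, test = idx[:split], idx[split:]

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
clf.fit(X[train], y[train])
print(f"per-keystroke accuracy on held-out traces: "
      f"{clf.score(X[test], y[test]):.1%}")
```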

The team ran tests using numerical passwords since they are easier to decipher than alphanumeric passwords.

They demonstrated Wiki-Eve by successfully lifting WeChat Pay passwords from a subject in a nearby conference room.

Wiki-Eve joins a long list of side-channel attack methods. Such methods include acoustic cryptanalysis that interprets sounds produced by a device during transmission, cache attacks that probe access patterns, electromagnetic analysis that uses radiation to decipher information, and thermal attacks that track temperature variations to reveal activities.

The study assumed users were engaging in activity over an unprotected network, as is common in public spaces such as coffee shops, airports, train stations, and other gathering places offering free WiFi.

The researchers had a simple recommendation for a defense against Wiki-Eve: “Since WiKI-Eve achieves keystroke eavesdropping by overhearing Wi-Fi BFI, the most direct defense strategy is to encrypt data traffic,” they said, “hence preventing attackers from obtaining BFI in cleartext.”

The researchers presented their study, “Password-Stealing without Hacking: Wi-Fi Enabled Practical Keystroke Eavesdropping,” on the preprint server arXiv. The team includes researchers from Hunan University and Fudan University in China, as well as Nanyang Technological University in Singapore.

More information:
Jingyang Hu et al, Password-Stealing without Hacking: Wi-Fi Enabled Practical Keystroke Eavesdropping, arXiv (2023). DOI: 10.48550/arxiv.2309.03492

Journal information:
arXiv


© 2023 Science X Network

Citation:
Exploit steals passwords by tapping into keystrokes (2023, September 13)
retrieved 29 June 2024
from https://techxplore.com/news/2023-09-exploit-passwords-keystrokes.html


Quantum data assimilation offers new approach to weather prediction



Image: A conceptual illustration of quantum annealing. Credit: Nonlinear Processes in Geophysics (2024). DOI: 10.5194/npg-31-237-2024

Data assimilation is a mathematical discipline that integrates observed data and numerical models to improve the interpretation and prediction of dynamical systems. It is a crucial component of Earth sciences, particularly in numerical weather prediction (NWP).

Data assimilation techniques have been widely investigated in NWP in the last two decades to refine the initial conditions of weather models by combining model forecasts and observational data. Most NWP centers around the world employ variational and ensemble-variational data assimilation methods, which iteratively reduce cost functions via gradient-based optimization. However, these methods require significant computational resources.

Recently, quantum computing has emerged as a new avenue of computational technology, offering a promising solution for overcoming the computational challenges of classical computers.

Quantum computers can take advantage of quantum effects such as tunneling, superposition, and entanglement to significantly reduce computational demands. Quantum annealing machines, in particular, are powerful for solving optimization problems.

In a study, published in Nonlinear Processes in Geophysics, Professor Shunji Kotsuki from the Institute for Advanced Academic Research/Center for Environmental Remote Sensing/Research Institute of Disaster Medicine, Chiba University, along with his colleagues Fumitoshi Kawasaki from the Graduate School of Science and Engineering and Masanao Ohashi from the Center for Environmental Remote Sensing, developed a novel data assimilation technique designed for quantum annealing machines.

“Our study introduces a novel quantum annealing approach to accelerate data assimilation, which is the main computational bottleneck for numerical weather predictions. With this algorithm, we successfully solved data assimilation on quantum annealers for the first time,” explains Prof. Kotsuki.

In the study, the researchers focused on the four-dimensional variational data assimilation (4DVAR) method, one of the most widely used data assimilation methods in NWP systems. However, since 4DVAR is designed for classical computers, it cannot be directly used on quantum hardware.

Prof. Kotsuki says, “Unlike the conventional 4DVAR, which requires a cost function and its gradient, quantum annealers require only the cost function. However, the cost function must be represented by binary variables (0 or 1). Therefore, we reformulated the 4DVAR cost function, a quadratic unconstrained optimization (QUO) problem, into a quadratic unconstrained binary optimization (QUBO) problem, which quantum annealers can solve.”
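
To illustrate what such a reformulation involves (a minimal sketch of the generic QUBO recipe, not the paper's 4DVAR formulation), a real variable can be encoded with fixed-point binary bits, after which a quadratic cost expands into diagonal and pairwise bit coefficients, using the identity b^2 = b for binary variables:

```python
import itertools
import numpy as np

# Encode x = sum_i w_i * b_i with 4 bits spanning [0, 7.5] in steps of 0.5.
weights = np.array([4.0, 2.0, 1.0, 0.5])
target = 5.5                     # minimize (x - target)^2, constant term dropped

n = len(weights)
Q = np.zeros((n, n))
for i in range(n):
    # b_i^2 = b_i for binary variables, so the diagonal collects both the
    # squared term and the linear term coming from -2 * target * x.
    Q[i, i] = weights[i] ** 2 - 2 * target * weights[i]
    for j in range(i + 1, n):
        Q[i, j] = 2 * weights[i] * weights[j]

# Brute force stands in for the annealer on this tiny problem.
best = min(itertools.product([0, 1], repeat=n),
           key=lambda b: np.array(b) @ Q @ np.array(b))
print("bits:", best, "-> x =", weights @ np.array(best))   # recovers x = 5.5
```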

The researchers applied this QUBO approach to a series of 4DVAR experiments using the 40-variable Lorenz-96 model, a dynamical system commonly used to test data assimilation methods, sketched below.
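
For reference, the Lorenz-96 system is defined by dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F with cyclic indices. A short sketch of the 40-variable configuration (standard textbook form, with the commonly used forcing F = 8 that yields chaotic dynamics):

```python
import numpy as np

N_VARS, FORCING = 40, 8.0

def lorenz96_tendency(x):
    """Cyclic Lorenz-96 tendencies computed via array rolls."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + FORCING

def step_rk4(x, dt=0.05):
    """One fourth-order Runge-Kutta step."""
    k1 = lorenz96_tendency(x)
    k2 = lorenz96_tendency(x + 0.5 * dt * k1)
    k3 = lorenz96_tendency(x + 0.5 * dt * k2)
    k4 = lorenz96_tendency(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

x = FORCING * np.ones(N_VARS)
x[19] += 0.01                    # small perturbation to trigger chaos
for _ in range(1000):
    x = step_rk4(x)
print("state mean/std after spin-up:", x.mean().round(2), x.std().round(2))
```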

They conducted the experiments using the D-Wave Advantage physical quantum annealer (Phy-QA) and Fixstars Amplify’s simulated quantum annealer (Sim-QA). They also tested the conventionally used quasi-Newton iterative approach, based on the Broyden-Fletcher-Goldfarb-Shanno formula, on linear and nonlinear QUO problems and compared its performance to that of the quantum annealers.

The results revealed that the quantum annealers produced analyses with accuracy comparable to the conventional quasi-Newton-based approaches, but in a fraction of the time.

D-Wave’s Phy-QA required less than 0.05 seconds for computation, much faster than the conventional approaches. However, it also exhibited slightly larger root-mean-square errors, which the researchers attributed to inherent stochastic quantum effects.

To address this, they found that reading out multiple solutions from the quantum annealer improved stability and accuracy. They also noted that the scaling factor for quantum data assimilation, which is important for regulating the analysis accuracy, was different for the D-Wave Phy-QA and the Sim-QA, owing to the stochastic quantum effects associated with the former annealer.

These findings signify the role of quantum computers in reducing the computational cost of data assimilation.

“Our approach could revolutionize future NWP systems, enabling a deeper understanding and improved predictions with much less computational time. In addition, it has the potential to advance the practical applications of quantum annealers in solving complex optimization problems in Earth science,” says Prof. Kotsuki.

Overall, the proposed innovative method holds great promise for inspiring future applications of quantum computers in advancing data assimilation, potentially leading to more accurate weather predictions.

More information:
Shunji Kotsuki et al, Quantum data assimilation: a new approach to solving data assimilation on quantum annealers, Nonlinear Processes in Geophysics (2024). DOI: 10.5194/npg-31-237-2024

Provided by
Chiba University


Citation:
Quantum data assimilation offers new approach to weather prediction (2024, June 13)
retrieved 29 June 2024
from https://phys.org/news/2024-06-quantum-assimilation-approach-weather.html
