
Governments need to focus on AI’s real impact, not get caught up in the hype generated by Big Tech



Credit: Pixabay/CC0 Public Domain

Statistics Canada recently released a detailed report estimating which professions are likely to be affected by artificial intelligence in the next few years.

It concludes with an optimistic message for education and health-care professionals, suggesting that not only are they expected to retain their jobs, but their productivity will be enhanced by AI advancements. However, the outlook is grimmer for workers in finance, insurance, and the information and cultural industries, who are predicted to see their careers derailed by AI.

Should doctors and teachers now breathe easy, while accountants and writers panic? Maybe, but not because of the data in this report.

What Statistics Canada offers here is a relatively meaningless exercise. It assumes that the key determinant of AI’s impact on jobs is the technology itself and how well it complements human efforts, not the business models designed to undermine our shared humanity. By making this mistake, the report is yet another casualty of buying into corporate-driven optimism at the expense of uglier business realities.

High exposure to AI hype

Corporations pushing new innovations or products that play on our greatest hopes and fears is nothing new. The only thing that may be novel is the sheer scale of Big Tech’s hopes for AI impact, which seem to reach every industry.

It’s no surprise, then, that there is widespread fear about what industries and sectors will be replaced by AI. Nor is it surprising that Statistics Canada would seek to allay some of those fears.

The study groups jobs into three categories:

  • those with high AI exposure and low complementarity, meaning humans may be competing directly with machines for these roles;
  • those with high AI exposure and high complementarity, where automation could enhance the productivity of the workers who remain essential to the job;
  • and those with low AI exposure, where replacement doesn’t seem to be a threat yet.
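The grouping above can be made concrete with a small sketch. The 0–1 scores and the 0.5 threshold here are invented purely for illustration; the report does not publish its scoring this way:

```python
# Illustrative sketch of the report's three-way grouping. Scores and the
# threshold are hypothetical, chosen only to show how the two axes interact.
def classify_job(exposure: float, complementarity: float,
                 threshold: float = 0.5) -> str:
    """Map a job's AI exposure and complementarity scores (0-1) to a category."""
    if exposure < threshold:
        return "low exposure"  # replacement doesn't seem to be a threat yet
    if complementarity >= threshold:
        return "high exposure, high complementarity"  # AI may boost productivity
    return "high exposure, low complementarity"  # humans compete with machines

print(classify_job(0.8, 0.2))  # e.g. cab driving, per the article's framing
```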

The report’s authors claim their approach—examining the relationship between exposure and complementarity—is superior to older methods that looked at manual versus cognitive or repetitive versus non-repetitive tasks when analyzing the impact of automation on workplaces.

However, by focusing on these categories, the study still buys into corporate hype. These categories of analysis were developed in 2021. Over the past few years, new windows have opened up, allowing us a clearer view of the ways Big Tech is rushing to deploy AI. The newly revealed unethical tactics render the predictive categories of exposure and complementarity fairly meaningless.

AI is often driven by people

Recent developments have shown that even jobs with high AI exposure and low AI complementarity are still relying on humans behind the scenes to do essential work. Take Cruise, the self-driving car company bought by General Motors in 2016 for more than $1 billion. Cab driving is a job with high AI exposure and low AI complementarity—we assume a cab is either being controlled by a human driver or, if it’s driverless, by AI.

As it turns out, Cruise’s “autonomous” cabs in California were not, in fact, driverless. There was remote human intervention every few miles.

If we were to accurately analyze this job, there are three categories to consider. The first is for in-car human drivers, the second is remote human drivers and the third is autonomous AI-driven vehicles. The second category makes complementarity fairly high here. But the fact that Cruise, and likely other tech companies, tried to keep this under wraps raises a whole new world of questions.

A similar situation emerged at Presto Automation, a company specializing in AI-powered drive-thru ordering for chains like Checkers and Del Taco. The company described itself as one of the biggest “labor automation technology providers” in the industry, but it was revealed that much of its “automation” is driven by human labor based in the Philippines.

Software company Zendesk presents another example. It once charged customers based on how often the software was used to try to resolve customer problems. Now, Zendesk only charges when its proprietary AI completes a task without humans stepping in.

Technically, this scenario could be described as high exposure and high complementarity. But do we want to support a business model where the customer’s first point of contact is likely to be frustrating and unhelpful? Especially knowing businesses will roll the dice on this model because they won’t be charged for those unhelpful interactions?

Scrutinizing business models

As it stands, AI presents more of a business challenge than a technological one. Government institutions like Statistics Canada need to be careful not to amplify the hype surrounding it. Policy decisions need to be based on a critical analysis of how businesses actually use AI, rather than on inflated predictions and corporate agendas.

To create effective policies, it’s crucial that decision-makers focus on how AI is truly being integrated into businesses, rather than getting caught up in speculative forecasts that may never fully materialize.

The role of technology should be to support human welfare, not simply reduce labor costs for businesses. Historically, every wave of technological innovation has brought about concerns about job displacement. The fact that future innovations may replace human labor is not new or to be feared; instead, it should prompt us to think critically about how it’s being used, and who stands to benefit.

Policy decisions, therefore, should be rooted in accurate, transparent data. Statistics Canada, as a key data provider, has an essential role to play here. It needs to offer a clear, unbiased view of the situation, ensuring policymakers have the right information to make informed decisions.

Provided by
The Conversation


This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation:
Researcher: Governments need to focus on AI’s real impact, not get caught up in the hype generated by Big Tech (2024, September 16)
retrieved 16 September 2024
from https://techxplore.com/news/2024-09-focus-ai-real-impact-caught.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.






Inspired by squids and octopi, a new screen stores and displays encrypted images without electronics



Holding the screen up to an array of magnets of different strengths can rewrite the magnetic properties of the pixels in targeted areas of the screen. Different arrays of magnets will program different images into the device. Credit: Jeremy Little, Michigan Engineering

A flexible screen inspired in part by squid can store and display encrypted images like a computer—using magnetic fields rather than electronics. The research is reported in Advanced Materials by University of Michigan engineers.

“It’s one of the first times where mechanical materials use magnetic fields for system-level encryption, information processing and computing. And unlike some earlier mechanical computers, this device can wrap around your wrist,” said Joerg Lahann, the Wolfgang Pauli Collegiate Professor of Chemical Engineering and co-corresponding author of the study.

The researchers’ screen could be used wherever light and power sources are cumbersome or undesirable, including clothing, stickers, ID badges, barcodes and e-book readers. A single screen can reveal an image for everyone to see when placed near a standard magnet or a private encrypted image when placed over a complex array of magnets that acts like an encryption key.

“This device can be programmed to show specific information only when the right keys are provided. And there is no code or electronics to be hacked,” said Abdon Pena-Francesch, U-M assistant professor of materials science and engineering and co-corresponding author. “This could also be used for color-changing surfaces, for example, on camouflaged robots.”







Shaking the screen erases the display—like an Etch-A-Sketch—except the image is encoded in the magnetic properties of beads inside the screen. It returns when the display is exposed to the magnetic field again.

The beads act like pixels by flipping between orange and white hemispheres. The orange halves of the beads contain microscopic magnetic particles that allow them to rotate up or down when exposed to a magnetic field, providing the color contrast needed to display an image.

Exposing the pixels to a magnet will program them to show either white or orange in either a pulling or pushing magnetic field—a state referred to as their polarization. For some pixels made with iron oxide magnetic particles, the polarization can be changed with relatively weak magnetic fields. But the polarization of pixels that also include neodymium particles is harder to change—a strong magnetic pulse is required.
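The two particle types can be thought of as pixels with different coercivities, the field strength needed to rewrite their polarization. A minimal toy model, with field strengths and thresholds invented for illustration rather than taken from the paper:

```python
# Toy model of the two pixel types. Coercivity values and field strengths
# are invented for illustration, not taken from the study.
class Pixel:
    def __init__(self, coercivity: float, polarization: int = +1):
        self.coercivity = coercivity      # field strength needed to rewrite the pixel
        self.polarization = polarization  # +1 or -1: response to a given field direction

    def apply_field(self, strength: float, direction: int) -> None:
        # A field only reprograms the pixel if it exceeds the coercivity threshold.
        if strength >= self.coercivity:
            self.polarization = direction

    def color(self, field_direction: int) -> str:
        # Displayed color depends on polarization relative to the reading field.
        return "orange" if self.polarization == field_direction else "white"

iron_oxide = Pixel(coercivity=1.0)   # reprogrammable with relatively weak fields
neodymium = Pixel(coercivity=10.0)   # needs a strong magnetic pulse to rewrite

for px in (iron_oxide, neodymium):
    px.apply_field(strength=2.0, direction=-1)  # a moderate rewriting field

print(iron_oxide.polarization, neodymium.polarization)  # -1 +1: only iron oxide flipped
```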

Abdon Pena-Francesch, an assistant professor of materials science and engineering (left), and Zane Zhang, a doctoral student in materials science and engineering (right), view squid skin under a microscope. The researchers based the size of their screen’s pixels on the animal’s pigment sacs. Credit: Jeremy Little, Michigan Engineering

Holding the screen over a grid of magnets with different strengths and orientations can selectively change the polarization in some parts of the screen, causing some pixels to flip white and others to flip orange under the same magnetic field orientation. This is how an image is encoded.

Then, the image can be displayed under any weak magnetic field, including a regular magnet. But because iron oxide particles can be reprogrammed with relatively weaker fields, private images can be displayed with a second magnetic grid that selectively rewrites how some areas of the screen flip. When returned to the standard magnet, the iron oxide pixels revert back to their original polarization to show the public image.

Several private images can be displayed from a single public image, each with a unique key. The decoding keys can also be programmed to only work with specific encoding keys for extra security.
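The public/private mechanism can be sketched in a few lines: soft (iron oxide) pixels can be rewritten by a weak key field, while hard (neodymium) pixels cannot. The pixel layout, key values, and display encoding below are all invented for illustration:

```python
# Sketch of the public/private display idea: a weak decoding key rewrites
# only the soft (iron oxide) pixels; hard (neodymium) pixels keep the
# public image. All values are invented for illustration.
HARD, SOFT = "hard", "soft"

def display(polarizations, field=+1):
    # Under a uniform reading field, polarization sets each pixel's color.
    return "".join("O" if p == field else "." for p in polarizations)

def apply_key(kinds, polarizations, key):
    # A weak key field rewrites only the soft pixels it addresses.
    return [k if kind == SOFT and k is not None else p
            for kind, p, k in zip(kinds, polarizations, key)]

kinds = [HARD, SOFT, HARD, SOFT, SOFT]
public = [+1, +1, -1, -1, +1]  # programmed with a strong encoding grid

private = apply_key(kinds, public, key=[None, -1, None, +1, -1])
print(display(public))   # OO..O  <- what any ordinary magnet reveals
print(display(private))  # O..O.  <- what the matching key reveals
```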

Pigment sacs speckle most of the surface of this squid specimen. Credit: Jeremy Little, Michigan Engineering

The team decided on the screen’s resolution by studying squids and octopi, which change color by expanding and contracting pigment sacs in their skin.

“If you make the beads too small, the changes in color become too small to see,” said Zane Zhang, U-M doctoral student in materials science and engineering and the study’s first author. “The squid’s pigment sacs have optimized size and distribution to give high contrast, so we adapted our device’s pixels to match their size.”

More information:
Zenghao Zhang et al, Janus Swarm Metamaterials for Information Display, Memory, and Encryption, Advanced Materials (2024). DOI: 10.1002/adma.202406149

Citation:
Inspired by squids and octopi, a new screen stores and displays encrypted images without electronics (2024, September 16)
retrieved 16 September 2024
from https://techxplore.com/news/2024-09-squids-octopi-screen-displays-encrypted.html







Researchers develop precise pricing formula for perpetual American strangle options



The insights gained from this study show that stochastic volatility has significant influence on the pricing of perpetual American strangle options and their boundary conditions, offering crucial insights for minimizing risk in volatile markets. Credit: Dr. Ji-Hun Yoon from Pusan National University, Korea

Perpetual American strangle options (PASOs) offer investors a way to minimize risk in highly volatile markets by allowing them to exercise the options at any time, with no expiration date. In a new study, researchers investigated the pricing of PASOs under a stochastic volatility model with fast mean reversion, which captures real markets better than traditional models.

Options are financial instruments that give the holder the right to buy or sell an underlying asset, at a predetermined price, on or before a specified date. For example, European-style options allow the buyer to exercise this right only at the maturity date, while American-style options can be exercised at any time up to and including the expiration date. These are generally traded in public financial markets, such as stock exchanges.

With the increasing complexity of markets, a wide range of products have emerged, including strangle options. A strangle option is an investment strategy that combines call and put options, both with the same expiration date but different strike prices. This strategy is typically used by investors who anticipate a large fluctuation in the market in either direction, as it helps minimize potential losses.
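The strangle's payoff structure can be written down directly: a put leg that pays when the price drops below the lower strike, plus a call leg that pays when it rises above the upper strike. The strike and spot values below are illustrative, not figures from the study:

```python
# A strangle combines a put (strike k_put) and a call (strike k_call > k_put)
# on the same underlying; it pays off when the price moves far in either
# direction. All numbers are illustrative.
def strangle_payoff(spot: float, k_put: float, k_call: float) -> float:
    """Payoff at exercise: put leg + call leg (at most one is nonzero)."""
    return max(k_put - spot, 0.0) + max(spot - k_call, 0.0)

print(strangle_payoff(80, 90, 110))   # 10.0: big drop, the put leg pays
print(strangle_payoff(100, 90, 110))  # 0.0: price stayed inside the band
print(strangle_payoff(125, 90, 110))  # 15.0: big rally, the call leg pays
```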

PASOs take this further by allowing the holder to exercise the options at any time, without an expiration date, providing considerable benefits. Consequently, PASOs have been the focus of considerable research. However, despite such studies, the pricing of PASOs and their early exercise boundaries have not yet been studied using a stochastic volatility (SV) model, which more accurately captures real market behavior compared to the Black-Scholes model.

Addressing this gap, a team of researchers led by Associate Professor Ji-Hun Yoon from Pusan National University, Korea, developed a pricing formula for PASOs under an SV model with fast mean reversion. Their findings were made available online on July 27, 2024, in Mathematics and Computers in Simulation.

“In recent years, financial markets have experienced considerable fluctuations during global financial crises, such as the US subprime mortgage crisis in 2007 and 2008, the Eurozone crisis in 2010, the COVID-19 pandemic, and the Russia-Ukraine conflict. American strangle options can help investors minimize risk during such crises,” says Dr. Yoon.

In this study, researchers first established a partial differential equation (PDE) for the value of PASOs under an SV model (PASOSV). A PDE is a mathematical equation that helps to model how one variable changes with respect to another.

In this case, the PDE models how the value of the PASOSV changes with the underlying asset’s price. However, due to the complexity of SV, an exact solution was not possible. Instead, the researchers applied an asymptotic analysis approach, incorporating a special term representing the fast mean-reversion rate of highly volatile markets.

To validate their formula, they used the Monte-Carlo simulation method, which predicts potential future values of assets through thousands of simulated scenarios. They also conducted numerical simulations to analyze how SV impacts the option price and the free boundary values using various parameters. The findings revealed that SV significantly influences option prices and exercise boundary values when volatility is low, indicating that while high volatility can give higher returns, low volatility can increase risk from investing in PASOs.
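The Monte-Carlo idea can be illustrated with a deliberately simplified sketch: constant volatility (geometric Brownian motion) and a fixed horizon, rather than the paper's perpetual, stochastic-volatility setting. All parameters are invented:

```python
# Simplified Monte-Carlo sketch: average the discounted strangle payoff over
# many simulated price paths. This uses constant volatility (GBM) and a finite
# horizon, NOT the paper's stochastic-volatility, perpetual setting; all
# parameters are invented for illustration.
import math
import random

def mc_strangle_price(s0=100.0, k_put=90.0, k_call=110.0, r=0.03,
                      sigma=0.25, horizon=1.0, n_paths=50_000, seed=42):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        # Terminal price under GBM: S_T = S0 * exp((r - sigma^2/2)T + sigma*sqrt(T)*Z)
        z = rng.gauss(0.0, 1.0)
        s_t = s0 * math.exp((r - 0.5 * sigma**2) * horizon
                            + sigma * math.sqrt(horizon) * z)
        total += max(k_put - s_t, 0.0) + max(s_t - k_call, 0.0)
    return math.exp(-r * horizon) * total / n_paths  # discounted average payoff

print(round(mc_strangle_price(), 2))
```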

“Our study lays the foundation for development of more resilient products by financial institutions, thereby providing investors with better tools and strategies to manage risk and maximize returns, especially in low volatility environments,” concludes Dr. Yoon.

More information:
Mijin Ha et al, Pricing for perpetual American strangle options under stochastic volatility with fast mean reversion, Mathematics and Computers in Simulation (2024). DOI: 10.1016/j.matcom.2024.07.030

Citation:
Researchers develop precise pricing formula for perpetual American strangle options (2024, September 16)
retrieved 16 September 2024
from https://phys.org/news/2024-09-precise-pricing-formula-perpetual-american.html







Computational marathon matches the efficiency of existing platform with the power of new supercomputer



After a ramp-up phase of 2.5 hours (with the machine still shared with other users), AiiDA workflows filled all available nodes, sustaining 99.96% machine utilization for over 18 hours. In total, 14,945,009 SCF iterations were executed across 944,428 ionic steps, as part of 99,225 DFT code runs (using SIRIUS-enabled Quantum ESPRESSO), resulting in the crystal-structure relaxation of 19,829 compounds. Credit: NCCR MARVEL

It took about 20 hours and a lot of coffee for a team of scientists from the Swiss National Center of Competence in Research NCCR MARVEL to complete a computational marathon that showcased both the power of Switzerland’s main supercomputing facility, and the level of maturity achieved by Swiss-made software tools for computational materials science.

The Alps supercomputer, which just became operational with its official inauguration on September 14, 2024, is one of the world’s most powerful supercomputers. It is managed by the Swiss National Supercomputing Center (CSCS) and it consists of a geo-distributed infrastructure mainly located in the Lugano data center.

During the acceptance phase, CSCS granted access to Alps to selected research groups. Among the first with this opportunity were members of the NCCR MARVEL, specifically Giovanni Pizzi’s group, part of the Laboratory for Materials Simulation (LMS) at PSI headed by Nicola Marzari, which uses computational methods to search for new materials for many applications.

Over the course of one day and one night on July 17 and 18, a team including Marnik Bercx, Michail Minotakis and Timo Reents, all from Pizzi’s group, embarked on what computational specialists call a “hero run”—a time slot when a supercomputing machine is entirely reserved for a single user, to use the full power of the entire machine to advance their own research, and demonstrate their capability of efficiently exploiting the immense computational power of the full system.

The PSI group wanted to match the power of the Alps supercomputer with AiiDA, an open-source tool that helps materials scientists automate the long and complex calculations required to simulate the properties of materials—either existing ones or those still waiting to be discovered.

In particular, they interfaced AiiDA and Alps to run high-throughput calculations, where thousands of different materials structures stored in a database are calculated in parallel. It is the kind of computational experiment that allows, for example, the selection of potential new battery materials out of thousands of known chemical compounds, helping experimentalists to focus their efforts on the most promising ones.

“We wanted to show that AiiDA can fill up all the nodes of a supercomputer with near-exascale performance for many hours and fully exploit the power of the machine while handling, running and maintaining many separate workflows simultaneously, which is necessary for high-throughput computations,” explains Bercx.

The run was managed remotely, with the AiiDA software installed on a PSI server and used to prepare all input files for the calculations to be performed. The actual computations were executed using an enhanced version of the widely used Quantum ESPRESSO computer code for materials simulations, powered by the SIRIUS library (developed within NCCR MARVEL at CSCS), which allows for optimal exploitation of the great computing power provided by the graphics processing units (GPUs) of Alps and implements novel algorithms that significantly improve the simulation success rate.

When the scientists got the green light from the CSCS staff around noon on the chosen date, they started sending input files to the Alps machine, where they were submitted to a scheduling software that distributed the jobs among the 2033 NVIDIA Grace Hopper nodes (including 8,132 GPUs and 585,504 CPU cores) that were granted for the hero run and queued them. On the other side of the connection, AiiDA was monitoring each job so that once it was finished, the files could be retrieved, parsed, and stored in AiiDA, and new calculations could be then submitted.
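The submit/monitor/retrieve cycle described above can be sketched as a simple polling loop. This is not AiiDA's actual API; the scheduler here is an invented in-memory stand-in, meant only to show the control flow a workflow engine automates:

```python
# Generic sketch of a high-throughput submit/monitor/retrieve loop, as a
# workflow engine like AiiDA automates it (NOT AiiDA's actual API).
class LocalScheduler:
    """Stand-in for a batch scheduler: each job 'runs' for a fixed number of polls."""
    def __init__(self):
        self._jobs = {}  # job_id -> (structure, polls remaining before "finished")

    def submit(self, structure):
        job_id = len(self._jobs)
        self._jobs[job_id] = (structure, 2)  # pretend each job needs 2 polls
        return job_id

    def is_finished(self, job_id):
        structure, left = self._jobs[job_id]
        self._jobs[job_id] = (structure, max(0, left - 1))
        return left == 0

    def retrieve(self, job_id):
        return f"relaxed:{self._jobs[job_id][0]}"

def run_high_throughput(structures, scheduler, max_in_flight=4):
    pending, running, done = list(structures), set(), []
    while pending or running:
        while pending and len(running) < max_in_flight:   # keep the machine full
            running.add(scheduler.submit(pending.pop()))
        for job_id in list(running):                      # poll, retrieve, parse
            if scheduler.is_finished(job_id):
                done.append(scheduler.retrieve(job_id))
                running.remove(job_id)
    return done

results = run_high_throughput(["mp-1", "mp-2", "mp-3"], LocalScheduler())
print(len(results))  # 3
```

A real run replaces `LocalScheduler` with the cluster's queueing system and adds a sleep between polling rounds; the point is that once jobs finish, their slots are immediately refilled, which is how AiiDA kept Alps at 99.96% utilization.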

Very quickly after starting the run, AiiDA could fill the whole Alps supercomputer with jobs, fully exploiting its outstanding computational capabilities. Around 3 AM, the team understandably needed a short nap, and relied on AiiDA to continue preparing and submitting new jobs in their absence. The run successfully ended around 9 AM on the second day.

“All went smoothly, and the number of available nodes was remarkably stable during the entire hero run, which speaks to the quality of the infrastructure,” says Bercx. The 99.96% utilization of a near-exascale machine is remarkable and quite unprecedented, very much achieving the goals of the NCCR MARVEL, which is dedicated to computational materials discovery enabled by such capabilities and infrastructure.

In the end, the team managed to complete almost 100,000 calculations, corresponding to single runs of Quantum ESPRESSO, in just about 16 hours. More specifically, the calculations were about the properties of around 20,000 crystal structures taken from the AiiDA database.

“We chose medium-sized structures, because Alps is so powerful that small structures would not use the computational power efficiently,” explains Minotakis. “We started with structures made out of 40 atoms, and then in subsequent submissions added slightly smaller and slightly larger structures.”

The computations were meant to calculate the electronic properties of the materials in their ground state, find whether they were magnetic or not, and calculate their ground-state geometric configuration.

“We also had new pseudopotentials that we wanted to test, so we updated the calculations for a large fraction of the structures in the database and checked the differences with previous calculations,” says Reents. All the results will soon be published as FAIR and open data and uploaded to the Materials Cloud, the online data-sharing platform of NCCR MARVEL, to expand the MC3D database of inorganic 3D crystal structures.

In addition to the great scientific value of these simulations, the run demonstrated the efficiency and stability of AiiDA, which could seamlessly fill the entire capacity of an exascale machine.

“The performance of the new Alps machine is outstanding, even more so when combined with the high-throughput capabilities of AiiDA. It is impressive that we could compress in less than a day the equivalent computing power granted for one full year to large supercomputing projects at CSCS, equivalent to approximately 800,000 GPU hours of computation on the previous-generation CSCS supercomputer Daint,” says Pizzi.

Provided by
National Centre of Competence in Research (NCCR) MARVEL

Citation:
Computational marathon matches the efficiency of existing platform with the power of new supercomputer (2024, September 16)
retrieved 16 September 2024
from https://techxplore.com/news/2024-09-marathon-efficiency-platform-power-supercomputer.html







Study finds mine-drainage treatment cost effective, but far more costs lay ahead



Abandoned mine drainage impaired streams with Datashed systems. Source: PA Department of Environmental Protection and Datashed.org. Credit: Communications Earth & Environment (2024). DOI: 10.1038/s43247-024-01669-0

New research led by the University of Pittsburgh shows that state and federal appropriations allowing Pennsylvania to treat abandoned mine drainage have both successfully and cost-effectively cleaned up the acidic water, particularly to the benefit of affected vulnerable communities. But the research also shows that current appropriations to the state are insufficient to treat all mine drainage over the long term while also addressing other abandoned mine hazards such as sinkholes.

“In the past 35 years, the Pennsylvania Department of Environmental Protection and numerous watershed groups have built more than 300 systems using state and federal funding to treat mine drainage before it enters nearby streams,” said Jeremy Weber, a professor in Pitt’s Graduate School of Public & International Affairs.

Working with Kenyon College’s Katie Jo Black, Weber co-authored the research published in Communications Earth & Environment.

“Data from existing treatment systems shows that they protect more than 1,000 miles, or 1,500 kilometers, of streams and rivers from impairment by mine drainage. The systems have been relatively cost-effective, protecting streams for $5,700 per kilometer per year. However, the state has nearly 5,600 miles, or 9,000 kilometers, of impaired streams and rivers remaining.”
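The figures in the quote can be checked back-of-the-envelope, assuming (purely for illustration) that the remaining streams could be treated at the same per-kilometer rate:

```python
# Back-of-the-envelope check of the quoted figures. Assumes the remaining
# streams could be treated at the same per-kilometer rate, which is only
# an illustration, not a claim from the study.
protected_km = 1_500          # streams protected by existing systems
cost_per_km_year = 5_700      # USD per kilometer per year
remaining_km = 9_000          # impaired streams still untreated

current_annual_cost = protected_km * cost_per_km_year
print(f"${current_annual_cost:,}/yr for existing systems")            # $8,550,000/yr
print(f"${remaining_km * cost_per_km_year:,}/yr to treat the rest")   # $51,300,000/yr
```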

Weber, who researches the policy and economics of environmental and energy issues, noted how the transition from coal to newer, greener forms of energy has spawned funding for the cleanup of legacy pollution, job retraining for displaced workers and other economic-development grants. Specifically, the 2021 Infrastructure Investment and Jobs Act (IIJA) appropriated $16 billion to clean up abandoned wells and mines, where drainage can “stymie local economies,” the researchers said.

“Recent U.S. legislation provides a historic appropriation for addressing abandoned mine hazards such as acidic drainage that can turn a stream orange, kill its fish, and sicken people if they ingest it,” Weber said. “Who the investment will benefit and what it will accomplish has been unclear.”

To attempt to find answers, their research focused on Pennsylvania, which the co-authors said contains the most abandoned mine liabilities in the United States and is estimated to receive roughly one-third of the mine-funding prescribed in the 2021 IIJA.

They found specific communities vulnerable to the deleterious effects.

“Pennsylvania communities most exposed to mine drainage have incomes 30% below that of unaffected communities and are twice as vulnerable to the energy transition,” Weber said.

Some 2.4 million people, or roughly 18.5% of Pennsylvania’s total population, live in a community (or Census tract) with a stream impaired by abandoned mine drainage, the co-authors said. In many instances, the impairment is extensive, with 500,000 people living in a community where at least half of their streams are impaired. The researchers found that those communities most affected by mine drainage are also much less prosperous than unaffected communities, with household incomes about 30% lower and housing values 50% lower.

Using data to study 265 systems, the co-authors observed outflow water that is substantially better in quality than inflow water—illustrating that the systems are improving the quality of mine drainage on average before it enters nearby streams. For instance, inflow water had a 4.3 pH average, “roughly the acidity of tomato juice,” they wrote, where outflow averaged near 6 pH.
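Because pH is the negative base-10 logarithm of hydrogen-ion concentration, the move from pH 4.3 to pH 6 corresponds to roughly a fifty-fold drop in acidity:

```python
# pH is -log10 of hydrogen-ion concentration, so each pH unit is a factor
# of 10. The improvement from pH 4.3 inflow to pH 6 outflow is therefore:
improvement = 10 ** (6.0 - 4.3)
print(round(improvement))  # ~50x less acidic
```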

They estimated that nearly 6,500 miles (10,400 kilometers) of streams need protection from abandoned mine drainage, including nearly 1,000 miles (1,500 kilometers) served by existing, aging systems. While treatment is currently cost-effective for surface-water quality, the co-authors calculate that over the next 25 years Pennsylvania will require $1.5 billion for the mine-drainage work ahead, plus another $3.9 billion to address liabilities unrelated to mine drainage, such as sinkholes, highwalls, and open mine shafts.

That means, the co-authors wrote, “the funds are less than half the amount needed.”

More information:
Katie Jo Black et al, Treating abandoned mine drainage can protect streams cost effectively and benefit vulnerable communities, Communications Earth & Environment (2024). DOI: 10.1038/s43247-024-01669-0

Citation:
Study finds mine-drainage treatment cost effective, but far more costs lay ahead (2024, September 16)
retrieved 16 September 2024
from https://phys.org/news/2024-09-drainage-treatment-effective-lay.html





