
A fully edible robot could soon end up on our plate, say scientists

Artistic rendering of a future edible robot. Credit: Nature Reviews Materials (2024). DOI: 10.1038/s41578-024-00688-9

A fully edible robot could soon end up on our plate if we overcome some technical hurdles, say EPFL scientists involved in RoboFood—a project which aims to marry robots and food.

Robots and food have long been distant worlds: Robots are inorganic, bulky, and non-disposable; food is organic, soft, and biodegradable. Yet, research that develops edible robots has progressed recently and promises positive impacts: Robotic food could reduce electronic waste, help deliver nutrition and medicines to people and animals in need, monitor health, and even pave the way to novel gastronomical experiences.

But how far are we from having a fully edible robot for lunch or dessert? And what are the challenges? Scientists from the RoboFood project, based at EPFL, address these and other questions in a perspective article in the journal Nature Reviews Materials.

“Bringing robots and food together is a fascinating challenge,” says Dario Floreano, director of the Laboratory of Intelligent Systems at EPFL and first author of the article. In 2021, Floreano joined forces with Remko Boom from Wageningen University, The Netherlands, Jonathan Rossiter from the University of Bristol, UK, and Mario Caironi from the Italian Institute of Technology, to launch the project RoboFood.

In the perspective article, RoboFood authors analyze which edible ingredients can be used to make edible robot parts and whole robots, and discuss the challenges of making them.

“We are still figuring out which edible materials work similarly to non-edible ones,” says Floreano. For example, gelatin can replace rubber, rice cookies are akin to foam, a chocolate film can protect robots in humid environments, and mixing starch and tannin can mimic commercial glues.

Robots au chocolat for dessert?
Credit: Ecole Polytechnique Federale de Lausanne

These and other edible materials make up the ingredients of robotic components. “There is a lot of research on single edible components like actuators, sensors, and batteries,” says Bokeon Kwak, a postdoc in the group of Floreano and one of the authors.

In 2017, EPFL scientists successfully produced an edible gripper, a gelatin-made structure that could handle an apple and be eaten afterward. EPFL, IIT, and the University of Bristol recently developed a new conductive ink that can be sprayed on food to sense its growth. The ink contains activated carbon as a conductor, while Haribo gummy bears are used as a binder. Other sensors can perceive pH, light, and bending.

In 2023, IIT researchers realized the first rechargeable edible battery using riboflavin (vitamin B2) and quercetin (found in almonds and capers) in the battery poles, adding activated carbon to facilitate electron transport and nori algae, used to wrap sushi, to prevent short circuits. Packaged with beeswax, the 4 cm wide edible battery can operate at 0.65 volts, still a safe voltage in case of ingestion; two edible batteries connected in series can power a light-emitting diode for about 10 minutes.
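The reported figures can be sanity-checked with some back-of-the-envelope arithmetic. In the sketch below, the cell voltage, cell count, and runtime come from the article; the LED drive current is an assumed value for illustration only, not from the paper:

```python
# Rough arithmetic on the edible battery demo.
# Reported: 0.65 V per cell, two cells in series, ~10 minutes of LED runtime.
# ASSUMED: the LED drive current (1 mA) is a hypothetical figure.
cell_voltage = 0.65        # V per cell (reported)
cells = 2
runtime_s = 10 * 60        # s (reported LED runtime)
assumed_current = 1e-3     # A, hypothetical LED current

voltage = cell_voltage * cells                    # series voltages add
energy_j = voltage * assumed_current * runtime_s  # E = V * I * t
print(voltage, energy_j)   # 1.3 V total; ~0.78 J delivered under these assumptions
```

At 1.3 V the series pack stays well within voltages considered safe in case of ingestion, consistent with the article's point about the single-cell 0.65 V figure.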

Once the components are ready, the goal is to produce fully edible robots. To date, scientists have succeeded in assembling partially edible robotic systems.

In 2022, researchers from EPFL and Wageningen University designed a drone with wings made of rice cookies glued together with gelatin. Scientists at EPFL and IIT have also created a partially edible rolling robot with pneumatic gelatin legs and an edible tilt sensor.

Before writing the recipe for fully edible robots, researchers face several challenges. One of them is the lack of understanding of how humans and animals perceive processed food with reactive and autonomous behavior. Also, fully edible electronics that use transistors and process information are still difficult to make.

“But the biggest technical challenge is putting together the parts that use electricity to function, like batteries and sensors, with those that use fluids and pressure to move, like actuators,” says Kwak. After integrating all components, scientists need to miniaturize them, increase the shelf life of robotic food… and give robots a pleasant taste.

More information:
Dario Floreano et al, Towards edible robots and robotic food, Nature Reviews Materials (2024). DOI: 10.1038/s41578-024-00688-9

Citation:
A fully edible robot could soon end up on our plate, say scientists (2024, June 14)
retrieved 24 June 2024
from https://techxplore.com/news/2024-06-fully-edible-robot-plate-scientists.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

Researchers create more precise 3D reconstructions using only two camera perspectives



From two images to a 3D object
Fields of application for 3D reconstructions include autonomous driving and monument conservation. Credit: Technical University Munich

In recent years, neural methods have become widespread in camera-based reconstructions. In most cases, however, hundreds of camera perspectives are needed. Meanwhile, conventional photometric methods exist which can compute highly precise reconstructions even from objects with textureless surfaces. However, these typically work only under controlled lab conditions.

Daniel Cremers, professor of Computer Vision and Artificial Intelligence at TUM, leader of the Munich Center for Machine Learning (MCML), and a director of the Munich Data Science Institute (MDSI), has developed a method together with his team that combines the two approaches.

It combines a neural representation of the surface with a precise model of the illumination process that accounts for light absorption and for the distance between the object and the light source. The brightness observed in the images is used to determine the angle and distance of the surface relative to the light source.
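The illumination model described above resembles near-light Lambertian shading, in which observed brightness depends on both the surface orientation and the inverse-square distance to a point light. A minimal sketch of that relationship follows (hypothetical code, not the authors' implementation):

```python
import numpy as np

# Near-light Lambertian shading for a point light source (illustrative sketch).
# Brightness falls off with the squared distance to the light and with the
# angle between the surface normal and the light direction -- the two cues
# that let brightness constrain surface angle and distance.

def shading(point, normal, albedo, light_pos, light_intensity=1.0):
    """Predicted image brightness at a surface point."""
    to_light = light_pos - point
    d = np.linalg.norm(to_light)             # distance to the light source
    l = to_light / d                         # unit direction towards the light
    cos_theta = max(np.dot(normal, l), 0.0)  # Lambert's cosine law
    return light_intensity * albedo * cos_theta / d**2  # inverse-square falloff

# Example: a surface point 2 m below the light, facing straight up.
p = np.array([0.0, 0.0, 0.0])
n = np.array([0.0, 0.0, 1.0])
light = np.array([0.0, 0.0, 2.0])
print(shading(p, n, albedo=0.8, light_pos=light))  # 0.8 * 1 / 2**2 = 0.2
```

Inverting this forward model over many pixels is what allows the brightness values to pin down the surface geometry even on textureless objects.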

“That enables us to model the objects with much greater precision than existing processes. We can use the natural surroundings and can reconstruct relatively textureless objects for our reconstructions,” says Cremers.

The paper is published on the arXiv preprint server and will be presented at the Conference on Computer Vision and Pattern Recognition (CVPR 2024) held in Seattle from June 17 to June 21, 2024.

Applications in autonomous driving and preservation of historical artifacts

The method can be used to preserve historical monuments or digitize museum exhibits. If these are destroyed or decay over time, photographic images can be used to reconstruct the originals and create authentic replicas.

Prof. Cremers’ team also develops neural camera-based reconstruction methods for autonomous driving, in which a camera films the vehicle’s surroundings. The autonomous car can model its surroundings in real time, build a three-dimensional representation of the scene, and use it to make decisions.

The process is based on neural networks that predict 3D point clouds for individual video images that are then merged into a large-scale model of the roads traveled.

More information:
Mohammed Brahimi et al, Sparse Views, Near Light: A Practical Paradigm for Uncalibrated Point-light Photometric Stereo, arXiv (2024). DOI: 10.48550/arxiv.2404.00098

Journal information:
arXiv


Citation:
Researchers create more precise 3D reconstructions using only two camera perspectives (2024, June 20)
retrieved 24 June 2024
from https://techxplore.com/news/2024-06-precise-3d-reconstructions-camera-perspectives.html


Turing test study shows humans rate artificial intelligence as more ‘moral’ than other people

Credit: Pixabay/CC0 Public Domain

A new study has found that when people are presented with two answers to an ethical question, most will think the answer from artificial intelligence (AI) is better than the response from another person.

“Attributions Toward Artificial Agents in a Modified Moral Turing Test,” a study conducted by Eyal Aharoni, an associate professor in Georgia State’s Psychology Department, was inspired by the explosion of ChatGPT and similar AI large language models (LLMs) which came onto the scene last March.

“I was already interested in moral decision-making in the legal system, but I wondered if ChatGPT and other LLMs could have something to say about that,” Aharoni said. “People will interact with these tools in ways that have moral implications, like the environmental implications of asking for a list of recommendations for a new car. Some lawyers have already begun consulting these technologies for their cases, for better or for worse.”

“So, if we want to use these tools, we should understand how they operate, their limitations and that they’re not necessarily operating in the way we think when we’re interacting with them.”

To test how AI handles issues of morality, Aharoni designed a form of a Turing test.

“Alan Turing, one of the creators of the computer, predicted that by the year 2000, computers might pass a test where you present an ordinary human with two interactants, one human and the other a computer, but they’re both hidden and their only way of communicating is through text. Then the human is free to ask whatever questions they want to in order to try to get the information they need to decide which of the two interactants is human and which is the computer,” Aharoni said.

“If the human can’t tell the difference, then, for all intents and purposes, the computer should be called intelligent, in Turing’s view.”

For his Turing test, Aharoni asked undergraduate students and AI the same ethical questions and then presented their written answers to participants in the study. They were then asked to rate the answers for various traits, including virtuousness, intelligence and trustworthiness.

“Instead of asking the participants to guess if the source was human or AI, we just presented the two sets of evaluations side by side, and we just let people assume that they were both from people,” Aharoni said. “Under that false assumption, they judged the answers’ attributes like ‘How much do you agree with this response, which response is more virtuous?'”

Overwhelmingly, the ChatGPT-generated responses were rated more highly than the human-generated ones.

“After we got those results, we did the big reveal and told the participants that one of the answers was generated by a human and the other by a computer and asked them to guess which was which,” Aharoni said.

For an AI to pass the Turing test, humans must not be able to tell the difference between AI responses and human ones. In this case, people could tell the difference, but not for an obvious reason.

“The twist is that the reason people could tell the difference appears to be because they rated ChatGPT’s responses as superior,” Aharoni said. “If we had done this study five to 10 years ago, then we might have predicted that people could identify the AI because of how inferior its responses were. But we found the opposite—that the AI, in a sense, performed too well.”

According to Aharoni, this finding has interesting implications for the future of humans and AI.

“Our findings lead us to believe that a computer could technically pass a moral Turing test—that it could fool us in its moral reasoning. Because of this, we need to try to understand its role in our society because there will be times when people don’t know that they’re interacting with a computer, and there will be times when they do know and they will consult the computer for information because they trust it more than other people,” Aharoni said.

“People are going to rely on this technology more and more, and the more we rely on it, the greater the risk becomes over time.”

The findings are published in the journal Scientific Reports.

More information:
Eyal Aharoni et al, Attributions toward artificial agents in a modified Moral Turing Test, Scientific Reports (2024). DOI: 10.1038/s41598-024-58087-7

Citation:
Turing test study shows humans rate artificial intelligence as more ‘moral’ than other people (2024, May 6)
retrieved 24 June 2024
from https://techxplore.com/news/2024-05-turing-humans-artificial-intelligence-moral.html


European tech must keep pace with US, China: Meta’s Clegg

Credit: CC0 Public Domain

Europe is lagging behind both the United States and China when it comes to technology and innovation, Nick Clegg, a top executive at US firm Meta, has told AFP.

Clegg, president of global affairs at the parent company of Facebook, Instagram and WhatsApp, said Europe had a “real problem”.

“We are falling very rapidly behind the US and China,” said Clegg, who was promoting a scheme to mentor startups on the continent.

“I think for too long, the view has been that Europe’s only role is to regulate. And then China imitates and America innovates.”

But he argued it was not possible to “build success on the back of a law”.

“You build success on the back of innovation, entrepreneurship, and a partnership between big tech companies and small startups.”

Clegg was promoting a scheme run by Meta and two French companies to offer five European startups six months of mentoring and access to their facilities.

Clegg has spearheaded previous efforts by Meta to invest in tech in Europe, announcing in 2021 that the US firm would create 10,000 jobs there to help build the “metaverse”.

Meta burnt through billions of dollars trying to make its metaverse project a reality but has since changed focus to artificial intelligence and announced thousands of layoffs, including in the teams working on the metaverse.

© 2024 AFP

Citation:
European tech must keep pace with US, China: Meta’s Clegg (2024, June 24)
retrieved 24 June 2024
from https://techxplore.com/news/2024-06-european-tech-pace-china-meta.html


Using drones will advance the inspection of remote runways in Canada and beyond, research suggests

Visualized results overlaid on a satellite map. The bright green contours highlight all detected targets, including runways, vegetation, water, and rough surfaces, distinguished by fill color: runways in purple, water in blue, vegetation in green, and rough surfaces in red. Credit: Drones (2024). DOI: 10.3390/drones8060225

With weather, limited flights and long distances, gravel runways at remote airports—particularly in northern Canada—are difficult to get to, let alone to inspect for safety.

So Northeastern University researcher Michal Aibin and his team have developed a more thorough, safer and faster way to inspect such runways using drones, computer vision and artificial intelligence. The work has been published in the journal Drones.

“Basically, what you do is you start the drone, you collect the data and—with coffee in your hand—you can inspect the entire runway,” says Aibin, visiting associate teaching professor of computer science at Northeastern’s Vancouver campus.

There are over 100 airports in Canada that are considered remote, Aibin says, meaning that they have no road or standard means of transportation leading to them. Thus, nearby communities’ food, medicine and other supplies all come by air.

The airports also predominantly feature gravel rather than asphalt runways, making them particularly susceptible to the elements.

But safety inspections are difficult. Engineers who inspect the remote airports must schedule a long flight, often during a narrow window of time dependent on the seasons, weather conditions and more.

A new, more reliable and less time-consuming method was needed.

So, Aibin worked with Northeastern associate teaching professor Lino Coria and student researchers to identify several types of defects for gravel runways, such as surface water pooling, encroaching vegetation, and smoothness defects like frost heaves, potholes and random large rocks.

Collaborating with Transport Canada (the Canadian government’s department of transportation) and Spexi Geospatial Inc., the researchers used computer vision and artificial intelligence to analyze drone images of remote runways in order to detect, characterize and classify defects.

“Our biggest novelty is we take all the images of the runway and we assess all the defects—like there’s some rocks, there’s maybe a hole, there’s maybe some aspects that are not initially visible to the human eye,” Aibin says.

The result is a new procedure for inspecting airport runways using high-resolution photos taken from remote-controlled, commercially available drones and high-powered computing. The new method proved effective when demonstrated at several remote airports, Aibin says.
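As an illustration of the general idea (a toy sketch, not the published pipeline), a tile-based detector can flag vegetation encroachment on a runway image; here the excess-green index ExG = 2G - R - B stands in for a trained classifier:

```python
import numpy as np

# Toy sketch of tile-based defect flagging on a drone orthophoto.
# ASSUMPTIONS: the tiling scheme, the ExG threshold, and the use of
# excess-green instead of a trained model are all illustrative choices.

def vegetation_tiles(image, tile=64, threshold=0.1):
    """Return (row, col) indices of tiles whose mean excess-green exceeds threshold.

    image: float array of shape (H, W, 3) with RGB values in [0, 1].
    """
    h, w, _ = image.shape
    flagged = []
    for r in range(0, h - tile + 1, tile):
        for c in range(0, w - tile + 1, tile):
            patch = image[r:r + tile, c:c + tile]
            exg = 2 * patch[..., 1] - patch[..., 0] - patch[..., 2]  # 2G - R - B
            if exg.mean() > threshold:
                flagged.append((r // tile, c // tile))
    return flagged

# Example: a 128x128 image, grassy on the left half, gray gravel on the right.
img = np.full((128, 128, 3), 0.5)       # gravel-gray background
img[:, :64] = [0.2, 0.7, 0.2]           # green vegetation strip
print(vegetation_tiles(img))            # flags the two left-column tiles
```

A production system would replace the ExG heuristic with the learned models the article describes, but the tile-and-classify structure is a common pattern for scanning large aerial images.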

The process doesn’t totally eliminate humans—a person must fly the drone and evaluate the computer analysis, Aibin notes (although those tasks can be done remotely). But Aibin says the method saves time, reduces the need for inspectors on site, and makes inspecting a remote gravel runway a much less onerous task.

Aibin says that the next step is testing the method in more real-world settings. He also sees the method being expanded beyond remote Canada into other remote parts of the world, such as Australia and New Zealand.

“The need to fly an engineer to the site is no longer needed, which was the ultimate goal,” Aibin says. “As long as someone can fly a drone and take images, then it can be sent in the form of a report to speed up the process.”

More information:
Zhiyuan Yang et al, Next-Gen Remote Airport Maintenance: UAV-Guided Inspection and Maintenance Using Computer Vision, Drones (2024). DOI: 10.3390/drones8060225

This story is republished courtesy of Northeastern Global News news.northeastern.edu.

Citation:
Using drones will advance the inspection of remote runways in Canada and beyond, research suggests (2024, June 14)
retrieved 24 June 2024
from https://techxplore.com/news/2024-06-drones-advance-remote-runways-canada.html
