A Robot Arm Can Move And Detect Objects By Touch (+VIDEO)



Researchers at Georgia Tech have presented a robot arm that moves through clutter and finds objects by touch. According to the designers, it can reach into a cluttered environment and use "touch," along with computer vision, to perform various tasks. The team believes such robots will one day move freely in human environments, and that the technology could be employed in hospital or rehabilitation settings, for example to assist in patient and elder care or in rescue missions during emergencies.

Software controls the arm's sense of touch as it reaches, making it possible to find specific objects in a cluttered area. The arm mimics human behavior: in tests the robot was able to bend, compress and slide objects out of its way. It is also covered in an artificial "skin" that senses pressure and touch.

The robot's arms were designed by Meka Robotics; the software is based on the Willow Garage Robot Operating System (ROS), which is meant to be shared freely. Hoping that other robot makers will improve and advance the design, the team has made its software open source as well and shared instructions for building and adapting the low-cost robot skin.

You can watch the robot arm wiping the mouth of a disabled man and adjusting a blanket in the video below.
Via: nytimes.com

