In 1738, inventor Jacques de Vaucanson allegedly created a mechanical duck that could quack, bathe, drink water and eat grain, and even appeared to digest and void its food. In the centuries since, robots have been put to more serious tasks, mainly performing mechanical functions, but rapidly advancing technology is now expanding what they can do, and the results and possibilities are becoming more interesting. Once found only in science fiction, robots can now talk, “think,” and make decisions, in addition to performing useful tasks such as augmenting human biomechanical functions and boosting manufacturing productivity.
Could robotics and AI redefine higher education?
Alan Wagner, assistant professor of aerospace engineering at Penn State, is embarking on a study of how social robotics could contribute to students’ social development and how robots could be used to support academic integrity decision-making. Funded by a recently awarded fellowship from Penn State Teaching and Learning with Technology, Wagner plans to develop an experimental framework, using immersive social spaces, to investigate how social, artificially intelligent robots can foster and enforce ethical behavior and academic integrity among students in higher education. He will also seek to understand the long-term ramifications of these systems for students.
“Contemporary robots are very limited; they cannot do many of the things that most people take for granted,” said Wagner, in an article on the Penn State website. “They have no common sense, previous experience or understanding of social norms without being explicitly programmed with this information. For these reasons, the use of social robots demands simplistic, well-structured and controlled situations.”
Wagner is currently developing methods that will allow nontechnical people, such as students, to work and interact with a robot. He will also develop a series of software programs and algorithms that allow a robot to play interactive games with a person, such as Connect 4, Uno and Checkers, which have pre-determined, well-defined rules. “My goal is to develop robots that teach students how to play games with empathy, integrity and social awareness,” said Wagner.
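Games like Connect 4 suit this kind of research precisely because their rules can be encoded exactly. As a minimal sketch of what "well-defined rules" means in software (a hypothetical illustration, not Wagner's actual code), a Connect 4 win check can be written in a few lines:

```python
# Hypothetical sketch: detecting a Connect 4 win on a 6x7 grid.
# A board is a list of rows; each cell is ".", "R", or "Y".

def has_won(board, piece):
    """Return True if `piece` has four in a row in any direction."""
    rows, cols = len(board), len(board[0])
    # Directions to scan: right, down, and the two diagonals.
    directions = [(0, 1), (1, 0), (1, 1), (1, -1)]
    for r in range(rows):
        for c in range(cols):
            for dr, dc in directions:
                cells = [(r + i * dr, c + i * dc) for i in range(4)]
                if all(0 <= rr < rows and 0 <= cc < cols
                       and board[rr][cc] == piece for rr, cc in cells):
                    return True
    return False
```

Because every legal move and win condition can be enumerated this way, such games give a robot the "simplistic, well-structured and controlled situations" Wagner describes.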
Robot arm assists with total knee replacement surgery
Orthopedics & Joint Replacement at Mercy Medical Center in Baltimore, MD, now offers minimally invasive MAKOplasty total knee replacement surgery that uses robotic technology.
The process begins with a CT scan of the patient’s knee joint, which is used to generate a 3D virtual model of the patient’s anatomy. This model is then uploaded onto the MAKOplasty system software and is used to create a pre-operative plan, specific to that patient.
Robotic technology improves accuracy, allows the surgeon to make adjustments for muscular and soft tissue alignment, and yields better outcomes for patients, according to Dr. Marc Hungerford, Chief of the Division of Orthopedics at Mercy.
“This advanced technology transforms the way joint replacement surgery is performed, enabling surgeons to more accurately position a patient’s joint replacement. The result is a better, and longer-performing joint, as well as a faster recovery after surgery,” Hungerford said.
Guided by the surgeon, the MAKOplasty robotic arm removes diseased bone and cartilage and then positions the knee replacement. The surgeon can make any necessary adjustments during the procedure while guiding the robotic arm.
Emotional robot expresses its ‘feelings’ with its skin
Cornell University researchers reported they have developed a robot prototype that can express “emotions” through changes in its outer surface. The robot’s skin covers a grid of texture units whose shapes change based on the robot’s feelings.
Assistant professor of mechanical and aerospace engineering Guy Hoffman said that he was inspired by the animal world when creating a robot that expresses nonverbal cues through its outer skin.
“I’ve always felt that robots shouldn’t just be modeled after humans or be copies of humans,” Hoffman said in the Cornell Chronicle. “We have a lot of interesting relationships with other species. Robots could be thought of as one of those ‘other species,’ not trying to copy what we do but interacting with us with their own language, tapping into our own instincts.”
Their work is detailed in a paper, “Soft Skin Texture Modulation for Social Robots,” presented at the International Conference on Soft Robotics in Livorno, Italy. Doctoral student Yuhan Hu was lead author; the paper was featured in IEEE Spectrum, a publication of the Institute of Electrical and Electronics Engineers.
Hoffman and Hu’s design features an array of two shapes, goosebumps and spikes, which map to different emotional states. The actuation units for both shapes are integrated into texture modules, with fluidic chambers connecting bumps of the same kind.
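The paper’s control software is not described in the article, but the core idea, mapping emotional states to texture actuation, can be illustrated with a small hypothetical sketch (the two shape names come from the article; the state names and pressure values are assumptions):

```python
# Hypothetical sketch of mapping a robot's emotional state to a skin
# texture command, loosely modeled on the goosebumps/spikes design.

# Each state maps to a texture shape and an actuation level (0.0-1.0),
# e.g. how strongly the fluidic chambers are pressurized.
TEXTURE_MAP = {
    "calm":    ("goosebumps", 0.2),
    "excited": ("goosebumps", 0.8),
    "afraid":  ("spikes", 0.5),
    "angry":   ("spikes", 0.9),
}

def texture_command(state):
    """Translate an emotional state into a (shape, level) command;
    unrecognized states leave the skin flat."""
    return TEXTURE_MAP.get(state, ("flat", 0.0))
```

A lookup table like this is the simplest possible design choice; a real system would likely blend shapes and intensities continuously rather than switching between discrete states.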