Contact: Lisa-Joy Zgorski
lisajoy@nsf.gov
703-292-8311
National Science Foundation
Search-and-rescue, planetary exploration, home health care and drug delivery are potential applications that offer tremendous economic and societal impacts
The National Science Foundation (NSF), in partnership with NASA, the National Institutes of Health (NIH) and the U.S. Department of Agriculture (USDA), today awarded just under $50 million to grantees around the country for the development and use of robots that cooperatively work with people and enhance individual human capabilities, performance and safety.
These mark the first round of awards of the Obama Administration's National Robotics Initiative (NRI) launched with NSF as the lead federal agency just over a year ago as part of the president's Advanced Manufacturing Partnership Initiative. NSF itself is announcing 31 of the awards totaling nearly $30 million.
Funded projects target the creation of next-generation collaborative robots, or co-robots, with applications in areas such as advanced manufacturing; civil and environmental infrastructure; health care and rehabilitation; military and homeland security; space and undersea exploration; food production, processing and distribution; assistive devices for improving independence and quality of life; and safer driving. Examples of co-robots envisioned include the co-worker in manufacturing, the co-protector in civilian and military venues, and the co-inhabitant assistant in the home of an elder living independently.
Last year, NSF issued a solicitation and managed the merit review process for more than 700 individual proposals requesting over $1 billion in funding. NSF's Directorates for Computer and Information Science and Engineering; Engineering; Education and Human Resources; and Social, Behavioral, and Economic Sciences worked collaboratively with the other agencies. Together, NSF, NASA, NIH and USDA participated in the review process and in all funding decisions.
Each agency applied the goals of NRI against its mission criteria: encouraging robotics research and technology development to enhance aeronautics and space missions for NASA; developing robotic applications for surgery, health intervention, prostheses, rehabilitation, behavioral therapy, personalized care and wellness/health promotion for NIH; promoting robotics research, applications, and education to enhance food production, processing and distribution that benefits consumers and rural communities for USDA; and advancing fundamental robotics research across all areas of national priority for NSF, including advanced manufacturing.
"Harnessing the expertise of the private and public sectors, across many disciplines will advance smart technology and ultimately revolutionize key drivers of America's productivity, economic competitiveness and quality of life," said Farnam Jahanian, assistant director of NSF's CISE Directorate.
Each federal agency announced its awards this morning. Later today, representatives from each participating agency will brief the Congressional Robotics Caucus. For details please see the notice on the Congressional Robotics Caucus website.
What follows is the list of the NSF-funded projects and principal investigators leading the research from each participating university. For projects that involve multiple institutions, the lead institution (and PI) is noted with an asterisk.
Collaborative Research: Multilateral Manipulation by Human-Robot Collaborative Systems
*Stanford University (Allison Okamura), University of California - Santa Cruz (Jacob Rosen), Johns Hopkins University (Gregory Hager), University of California - Berkeley (Pieter Abbeel)
This project seeks to emulate the expert-apprentice relationship using human beings and robots. It focuses on developing ways in which robots can learn from human activity in order to help humans by providing more hands, eyes and brain power as necessary, enabling multilateral manipulation from multiple vantage points. Applications in the manufacturing plant or in the operating room are potentially numerous.
Collaborative Research: Purposeful Prediction: Co-robot Interaction via Understanding Intent and Goals
*Carnegie-Mellon University (James Bagnell), Massachusetts Institute of Technology (Joshua Tenenbaum), University of Washington (Dieter Fox)
This project focuses on recognizing human intention--that is, teaching a robot to forecast what a human is going to do, so that robots may more effectively collaborate with humans. The inability of robots to anticipate human needs and goals today represents a fundamental barrier to the large-scale deployment of robots in the home and workplace. This project seeks to develop a new science of purposeful prediction using algorithms that may be applied to human-robot interaction across a wide variety of domains.
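The release does not spell out the project's algorithms, but intent forecasting of this kind is often framed as Bayesian goal inference: maintain a probability distribution over candidate goals and sharpen it as the person's observed motion favors some goals over others. The sketch below is only an illustration of that general idea; the goal set, the noise parameter beta and the progress-based likelihood are invented for the example and are not drawn from the funded work.

```python
import math

# Hypothetical goal locations a person might reach for (e.g., objects on a table).
GOALS = {"mug": (0.6, 0.2), "phone": (0.1, 0.5), "keys": (-0.3, 0.4)}

def step_likelihood(prev_pos, curr_pos, goal, beta=5.0):
    """Likelihood that a single movement step was aimed at `goal`.

    Assumes roughly goal-directed motion: steps that reduce the
    distance to a goal are exponentially more likely under that goal.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    progress = dist(prev_pos, goal) - dist(curr_pos, goal)
    return math.exp(beta * progress)

def update_beliefs(beliefs, prev_pos, curr_pos):
    """One Bayesian update of the goal distribution given an observed step."""
    posterior = {name: beliefs[name] * step_likelihood(prev_pos, curr_pos, xy)
                 for name, xy in GOALS.items()}
    total = sum(posterior.values())
    return {name: p / total for name, p in posterior.items()}

if __name__ == "__main__":
    beliefs = {name: 1.0 / len(GOALS) for name in GOALS}   # uniform prior
    hand_path = [(0.0, 0.0), (0.15, 0.05), (0.3, 0.1), (0.45, 0.15)]
    for prev, curr in zip(hand_path, hand_path[1:]):
        beliefs = update_beliefs(beliefs, prev, curr)
    print({name: round(p, 3) for name, p in beliefs.items()})  # "mug" should dominate
```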
Collaborative Research: Soft Compliant Robotic Augmentation for Human-Robot Teams
*Massachusetts Institute of Technology (Daniela Rus), Harvard University (Robert Wood), the University of Colorado at Boulder (Nikolaus Correll)
This research explores the use of soft materials for robots, specifically in the design of soft compliant fingers and hands, so that humans and robots may more effectively coexist in shared environments. Made more affordable, soft manipulators can enable in-home assistants to easily and unobtrusively navigate the natural world of the elderly or incapacitated.
A Design Methodology for Multi-fingered Robotic Hands with Second-order Kinematic Constraints
*Idaho State University (Alba Perez Gracia), the University of California Irvine (J. Michael McCarthy)
This research focuses on the adoption and integration of specific characteristics of human hands in robots in order to accomplish a desired task, whether that entails lifting a small, unusually-shaped part for assembly or moving a bulky object. This tool will increase the ability of industry to design high performance, cost-effective multi-fingered robotic hands and other end effectors.
Collaborative Research: A Dynamic Bayesian Approach to Real-Time Estimation and Filtering in Grasp Acquisition and Other Contact Tasks
*Rensselaer Polytechnic Institute (Jeffrey Trinkle), State University of New York (SUNY) at Albany (Siwei Lyu)
This project is developing techniques to enable robots to grasp objects or perform other contact tasks in unstructured, uncertain environments with speed and reliability. Using the proposed method, sensor data tracks the continuous motions of manipulated objects while models of the objects are simultaneously updated. Applications include search and rescue, planetary exploration, manufacturing and even home use, where everyday but important uncertainties arise, such as moving a bowl effectively whether it is full or empty.
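The description of tracking object motion while simultaneously updating object models matches the general shape of recursive Bayesian filtering. As a hedged illustration only (the project's actual dynamic Bayesian formulation is not given in the release), the sketch below runs a bare-bones particle filter that tracks a one-dimensional object position from noisy sensor readings; the motion and sensor models are assumptions made for the example.

```python
import math
import random

def particle_filter_step(particles, control, measurement,
                         motion_noise=0.02, sensor_noise=0.05):
    """One predict/update/resample cycle of a minimal 1-D particle filter."""
    # Predict: push each hypothesis through an assumed motion model.
    predicted = [p + control + random.gauss(0.0, motion_noise) for p in particles]
    # Update: weight hypotheses by how well they explain the noisy measurement.
    weights = [math.exp(-0.5 * ((measurement - p) / sensor_noise) ** 2)
               for p in predicted]
    total = sum(weights) or 1e-12
    weights = [w / total for w in weights]
    # Resample: draw a new particle set in proportion to the weights.
    return random.choices(predicted, weights=weights, k=len(particles))

if __name__ == "__main__":
    true_pos = 0.0
    particles = [random.uniform(-0.5, 0.5) for _ in range(500)]
    for _ in range(20):                       # object nudged 1 cm per step
        true_pos += 0.01
        reading = true_pos + random.gauss(0.0, 0.05)
        particles = particle_filter_step(particles, 0.01, reading)
    estimate = sum(particles) / len(particles)
    print(f"true position: {true_pos:.3f}   estimate: {estimate:.3f}")
```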
Collaborative Research: Addressing Clutter and Uncertainty for Robotic Manipulation in Human Environments
Carnegie-Mellon University (Siddhartha Srinivasa), Northwestern University (Kevin Lynch)
The long-term goal of this project is to develop personal robots that share a workspace with humans. To achieve the goal of personal robots in homes, the robots must adapt to the humans' living space, which can be cluttered and unstructured. The models here are the messy desk or a crowded refrigerator--in both examples, a robot must be able to extract a specific item or complete another task at hand from a landscape that may also be accessed and altered by humans.
Collaborative Research: Assistive Robotics for Grasping and Manipulation using Novel Brain Computer Interfaces
*Columbia University (Peter Allen), University of California, Davis (Sanjay Joshi)
This project aims to make concrete some of the major goals of assistive robotics, drawing on a team of experts from the fields of signal processing and control, robotic grasping and rehabilitative medicine. The research team is working to create a field-deployable assistive robotic system that will allow severely disabled patients to control a robot arm/hand system to perform complex grasping and manipulation tasks using novel brain-muscle-computer interfaces.
Collaborative Research: Multiple Task Learning from Unstructured Demonstrations
University of Massachusetts Amherst (Andrew Barto)
This research centers on programming by human demonstration. The project is developing techniques for the efficient, incremental learning of complex robotic tasks by breaking unstructured demonstrations into reusable component skills. A simple interface that allows end-users to intuitively program robots is a key step to getting robots out of the laboratory and into human-cooperative settings in the home and workplace.
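The release describes breaking unstructured demonstrations into reusable component skills without detailing how. As a rough, hypothetical illustration of segmentation, the sketch below splits a one-dimensional demonstrated trajectory wherever its velocity changes abruptly; the threshold and the toy demonstration are assumptions, not the project's method.

```python
def segment_demonstration(positions, velocity_jump=0.05):
    """Split a demonstrated 1-D trajectory into segments (candidate skills).

    A new segment starts wherever the velocity changes abruptly -- a crude
    stand-in for the statistical segmentation a real system would learn.
    """
    velocities = [b - a for a, b in zip(positions, positions[1:])]
    boundaries = [0]
    for i in range(1, len(velocities)):
        if abs(velocities[i] - velocities[i - 1]) > velocity_jump:
            boundaries.append(i)
    boundaries.append(len(positions) - 1)
    return list(zip(boundaries, boundaries[1:]))

if __name__ == "__main__":
    # A made-up demonstration: reach forward, hold, then retract.
    demo = [0.1 * t for t in range(10)] + [0.9] * 5 + [0.9 - 0.2 * t for t in range(1, 5)]
    print(segment_demonstration(demo))   # three segments: reach, hold, retract
```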
A Biologically Plausible Architecture for Robotic Vision
University of California-San Diego (Nuno Vasconcelos)
The project seeks to develop a vision architecture for robots based on biologically inspired examples of high-level vision systems (for example, gaze: ascertaining what a human is looking at) that is both biologically plausible and jointly optimal. This system would be useful for attention, object tracking, and object and action recognition in both static and dynamic environments.
Context-Driven Haptic Inquiry of Objects Based on Task Requirements for Artificial Grasp and Manipulation
Arizona State University (Veronica Santos)
This work focuses on the sense of touch. The project aims to advance artificial manipulators by integrating a new class of multimodal tactile sensors with artificial, humanlike hands and developing inquiry routines based on contextual touch. Weight given to each mode of tactile sensing (force, vibration, temperature) will also be tuned according to the context of the task. The research explores how to make use of this stimulus, in order to enable assistive robots to better grasp, hold and carry objects.
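The idea of tuning the weight of each tactile modality to the task context can be illustrated with a simple weighted fusion, sketched below. The contexts, weights and scoring rule are invented for the example and stand in for whatever learned model the project actually develops.

```python
# Illustrative per-context weights for three tactile modalities.
# The contexts and values are invented; a real system would learn them from data.
CONTEXT_WEIGHTS = {
    "handing_over_mug": {"force": 0.6, "vibration": 0.3, "temperature": 0.1},
    "checking_hot_cup": {"force": 0.2, "vibration": 0.1, "temperature": 0.7},
    "detecting_slip":   {"force": 0.3, "vibration": 0.6, "temperature": 0.1},
}

def fuse_tactile(readings, context):
    """Combine normalized tactile readings (0..1) using context-dependent weights."""
    weights = CONTEXT_WEIGHTS[context]
    return sum(weights[mode] * readings[mode] for mode in weights)

if __name__ == "__main__":
    readings = {"force": 0.4, "vibration": 0.9, "temperature": 0.2}
    for context in CONTEXT_WEIGHTS:
        print(f"{context}: salience = {fuse_tactile(readings, context):.2f}")
```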
Contextually Grounded Collaborative Discourse for Mediating Shared Basis in Situated Human Robot Dialogue
Michigan State University (Joyce Chai)
This project focuses on human-robot dialogue, bridging the chasm of understanding between human partners and robots that have completely mismatched capabilities in perceiving and reasoning about the environment. It centers on developing techniques that will support mediating the shared perceptual basis for effective conversation and task completion. With an ability to use what is known to shed light on what is not yet known (that is, using the power of inference in situations that give clues to meaning), this research could benefit many applications in manufacturing, public safety and healthcare.
Cooperative Underwater Robotic Networks for Discovery & Rescue
University of Connecticut (Chengyu Cao)
This project aims to develop a cooperative underwater robotic network for exploration, discovery and rescue, activities currently undermined by murky underwater conditions in which traditional acoustic and radio communication networks do not work. So-called autonomous underwater vehicles offer inherent advantages over manned vehicles in cost and efficiency: they eliminate the need for life-support systems and the risk to human life, while enabling assessment and damage mitigation after an incident under the water's surface, such as an oil spill.
Core Technologies for MRI-powered Robots
Children's Hospital Corporation (Pierre Dupont)
This project aims to produce robots that can not only tolerate a Magnetic Resonance Imaging (MRI) environment, but can use its attributes to do useful things. These include crawling inside body cavities to perform interventions or becoming robotic prosthetic implants. At the millimeter and sub-millimeter scale, groups of MRI-powered robots could swim inside fluid-filled regions of the body to perform targeted therapies, such as drug and cell delivery, or assemble as a sensor network. MRI system environments are typically challenging for robots. With the use of two testbeds at different scales, the project seeks to create a transformative robotic technology that uses MRI systems to power, command and control robots under the guidance and control of a clinician.
Co-Robots for STEM Education in the 21st Century
University of California-Davis (Harry Cheng)
This project studies how to use co-robot systems and math-oriented co-robotics competitions to enhance student engagement, increase student motivation in learning algebra and subsequent science, technology, engineering and mathematics (STEM) subjects, and pique interest in pursuing STEM-related careers and post-secondary study. Using a unique robotics platform, a Lego-like intelligent modular system designed for K-12 education, this project prepares teachers to engage their students with relevant pedagogy that illustrates abstract math concepts with concrete applications using computing and robotics.
Expert-Apprentice Collaboration
Duke University (Carlo Tomasi)
This research centers on an integration of the classic expert-apprentice relationship into robotics. It develops visual feature-based methods that allow robots to teach humans, as well as learn from them, within a unified apprenticeship-learning framework. The project hopes to tap the potential of teaching by demonstration or imitation as a powerful and practical approach to realizing the promise of large-scale personal robotics in a wide range of applications. It will also simplify programming, enabling non-expert users to program computers.
Improved safety and reliability of robotic systems by faults/anomalies detection from uninterpreted signals of computation graphs
California Institute of Technology (Richard Murray)
This research centers on detecting error conditions--that is, figuring out when things are going wrong, and/or when conditions may have been tampered with or altered by a human. This project addresses the main challenges of designing robots that can operate around humans to create systems that can guarantee safety and effectiveness, while being robust to the nuisances of unstructured environments, from hardware faults to software issues, erroneous calibration and less predictable anomalies, such as tampering and sabotage.
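As a generic, hedged illustration of flagging anomalies in an otherwise uninterpreted signal stream, the sketch below applies a sliding-window z-score test to a sequence of samples. The window size, threshold and injected fault are assumptions for the example; the project's actual methods over computation graphs are not described in the release.

```python
import random
import statistics
from collections import deque

class SignalMonitor:
    """Flags samples that deviate sharply from a signal's recent behavior.

    A simple sliding-window z-score check -- a stand-in for the far richer
    analysis a real fault/anomaly detector would perform.
    """
    def __init__(self, window=50, threshold=4.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value):
        anomalous = False
        if len(self.history) >= 10:            # wait for enough context
            mean = statistics.fmean(self.history)
            spread = statistics.pstdev(self.history) or 1e-9
            anomalous = abs(value - mean) / spread > self.threshold
        self.history.append(value)
        return anomalous

if __name__ == "__main__":
    monitor = SignalMonitor()
    stream = [random.gauss(1.0, 0.05) for _ in range(200)]
    stream[120] = 3.5                          # injected fault (spike or tampering)
    flagged = [i for i, v in enumerate(stream) if monitor.observe(v)]
    print("anomalies flagged at samples:", flagged)
```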
Measuring Unconstrained Grasp Forces Using Fingernail Imaging
University of Utah (Stephen Mascaro)
This project develops the technology for unconstrained, multi-fingered measurement of human grasp forces using a fingernail imaging technique. Human subjects freely choose where to place their fingers on objects, allowing for unconstrained multi-finger grasping. The co-robot then detects the individual finger forces of a human partner by ascertaining blood flow, as measured through color change on a fingernail. A co-robot trained with the appropriate calibration data could recognize and emulate or adapt to a human partner's grasp forces, measured using only vision.
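The calibration step implied here, mapping fingernail coloration to finger force, can be illustrated loosely with a least-squares fit on synthetic data, as sketched below. The image features, the linear model and the data are assumptions made for the example, not the project's actual calibration procedure.

```python
import numpy as np

def fit_force_calibration(color_features, measured_forces):
    """Least-squares linear map from fingernail-coloration features to force.

    color_features: (n_samples, n_features) image-derived values.
    measured_forces: (n_samples,) ground-truth forces from a reference sensor.
    """
    X = np.hstack([color_features, np.ones((len(color_features), 1))])  # add bias term
    coeffs, *_ = np.linalg.lstsq(X, measured_forces, rcond=None)
    return coeffs

def predict_force(coeffs, color_features):
    X = np.hstack([color_features, np.ones((len(color_features), 1))])
    return X @ coeffs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic calibration data: applied force drives a made-up coloration response.
    true_force = rng.uniform(0.0, 10.0, size=(200, 1))
    features = np.hstack([0.08 * true_force, -0.03 * true_force]) \
        + rng.normal(0.0, 0.01, size=(200, 2))
    coeffs = fit_force_calibration(features, true_force.ravel())
    new_image_features = np.array([[0.4, -0.15]])      # features from a new frame
    print("predicted force:", float(predict_force(coeffs, new_image_features)[0]))
```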
Mixed Human-Robot Teams for Search and Rescue
University of Minnesota-Twin Cities (Maria Gini)
This research explores how to make groups of robots behave to accomplish a common goal. The project aims at increasing the ability to respond to large-scale disasters and manage emergencies by including robots and agents as team-mates of humans in search and rescue teams. The project focuses on large teams of humans and robots that have only incomplete knowledge of the disaster situation while they accomplish the mission to rescue people and prevent fires.
Multifunctional Electroactive Polymers for Muscle-Like Actuation
University of California-Los Angeles (Qibing Pei)
This project aims to develop a new, softer polymer material that is electrically stimulated to behave like an artificial muscle. This offers a combination of attributes for future robotic systems including power output that outperforms human skeletal muscle, flexibility, quietness and biocompatibility. Actuators based on this more muscle-like material enable the design of robotic systems that interact more comfortably with people, such as assistive prostheses and other assistive devices for people with disabilities, humanoid robots for elderly in-home care, and surgical robots that save lives.
Multi-modal sensor skin and garments for healthcare and home robots
University of Texas at Arlington (Dan Popa)
This research seeks to build skin material that functions like human skin to give robots a learned sense of touch similar to that of humans. The objective of this research is to answer fundamental design questions for multi-functional robotic skin sensors and to optimize their placement onto assistive robotic devices. The aim is to teach the robot how to use the skin sensors efficiently, and quantitatively assess the impact of this assistive technology on humans. The research may unlock near-term and unforeseen applications of robotic skin with broad applicability, and especially to home assistance, medical rehabilitation and prosthetics.
Perceptually Inspired Dynamics for Robot Arm Motion
University of Wisconsin-Madison (Michael Gleicher)
This project seeks to enable a computer to learn from its own trials, errors and successes how to move and how to plan future appropriate motions. Researchers are working to develop an understanding of human perception of movement that can be applied to the development of robot trajectory planning and control algorithms, using human subjects experiments to understand and evaluate the interpretation of movements and apply these findings in robotics and motion synthesis.
Robot Assistants for Promoting Crawling and Walking in Children at Risk of Cerebral Palsy
University of Oklahoma Norman Campus (Andrew Fagg)
This research is developing effective robotic assistance tools to teach infants with, or at risk of developing, cerebral palsy how to walk and move, mitigating future deficits in cognitive development that are considered neural side effects of the condition. The project will develop and test a sequence of robotic assistants that promote early crawling, creeping and walking, along with a model of infant-robot interaction that encourages the continued practice of movement patterns that will ultimately lead to unassisted locomotion. The robotic assistants will aid infants in developing locomotory skills by selectively supporting a portion of their weight and providing artificial, rewarding locomotory experiences.
Robot Movement for Patient Improvement - Therapeutic Rehabilitation for Children with Disabilities
Georgia Tech Research Corporation (Ayanna Howard)
This research is focused on developing state-of-the-art techniques to facilitate the interaction necessary for robots to be useful for therapeutic rehabilitation with respect to children. Based on the logic that animate playthings naturally engage children, the goal of this project is to fuse play and rehabilitation techniques using a robotic design to induce child-robot interaction that will be entertaining as well as effective for pediatric rehabilitation. Of importance within this proposed work are approaches that allow therapists to provide instruction to robots on rehabilitation tasks that can be remapped to play behaviors specific to an individual child. In addition, robots must have internal perception and inference algorithms that allow them to learn new play behaviors and incorporate them to evoke corresponding behaviors in the child.
Robotic Treadmill Therapy for Lower Spinal Cord Injuries
University of Utah (John Hollerbach)
This project is developing new rehabilitation therapies for patients with incomplete lower spinal cord injuries, specifically the use of a body-weight-assisted robotic treadmill that provides a realistic walking experience in a safe and flexible virtual environment. The "Treadport" overcomes limitations of current rehabilitation treadmills, which are too dissimilar from everyday walking and therefore limit a patient's recovery. It adapts to the patient's walking speed and effort, builds resistance to falling by training the patient against unexpected perturbations, and encourages arm-swing coordination, which is critical to normal walking.
Robust, highly-mobile MEMS micro-robots based on integration of piezoelectric and polymer materials
University of Michigan Ann Arbor (Kenn Oldham)
This project focuses on developing tiny, millimeter or sub-millimeter scale robots (smaller than fleas) whose skeletal system is composed of crystal and ceramic, which makes them highly maneuverable with stronger, mini muscles. Prototypes will be developed, tested and perfected. These micro-robots could be eventually used to get into hard-to-reach places, and to crawl around to observe things up close or to complete a task. Applications range from exploration to surveillance, from observation to micro-surgery.
Spatial Primitives for Enabling Situated Human-Robot Interaction
University of Southern California (Maja Mataric)
This research centers on the creation of "social robotics"--that is, robots that interact in appropriate social ways as understood by humans, which may include deference to personal or social space, appropriate gestures and the use of verbal and physical comments. This project focuses on answering the question: how do social (speech and gesture), environmental (loud noises and low lighting), and personal (hearing and visual impairments) factors influence positioning and communication between humans and co-robots, and how should a co-robot adjust its social behaviors to maximize human perception of its social signals?
The Intelligent Workcell - Enabling Robots and People to Work Together Safely in Manufacturing Environments
Carnegie-Mellon University (Paul Rybski)
This research will develop an "Intelligent Workcell" to enable people and industrial robots to work safely and more efficiently within the same workspace. New capabilities in robotic workcell monitoring will likely result. Smart work environments know where workers are, what they need and what they are doing, so the robots can stay out of the way and offer assistance when it is useful.
Virtualized Welding: A New Paradigm for Intelligent Welding Robots in Unstructured Environment
University of Kentucky Research Foundation (Ruigang Yang)
Zeroing in on welding, a widely used manufacturing process performed by highly skilled workers, this project will develop a new robotic platform with novel 3D modeling and visualization algorithms designed to complement the skills and expertise of a human welder with the advanced sensing tools of a robotic one. The primary use for this new technology is in manufacturing. Successful completion of the project lays the foundation for intelligent welding robots with closed-loop intelligent control. Such a robotic system can perform high-speed, high-precision welding while allowing more variation in workpieces and environments. In addition, virtualized welding can be integrated with a mobile platform to allow welding in places that are hazardous or unsuitable for human welders.
Human-Robot Collectives as a Curriculum-Wide CS Learning Platform
Rochester Institute of Technology (Zack Butler)
This project is an effort to re-conceptualize what it means to study computer science at the undergraduate level. The project team will design a sequence of computer science courses that integrate the use of a network of robots to facilitate student learning. In this project, the co-robot teams share space and tasks with humans and are used as a teaching platform in an introductory context, and later, as a laboratory platform for projects in intermediate and upper-level courses in which students can develop and even invent new services. This approach enhances a traditional approach to teaching computer science and provides ample opportunities for students to design, test, and evaluate using co-robot systems.
Managing Uncertainty in Human-Robot Cooperative Systems
Johns Hopkins University (Peter Kazanzides)
This research aims to capitalize on the distinct yet complementary strengths of humans (perception and reasoning) and machines (precision, accuracy and repeatability in information fusion, task planning and simulation) to design truly cooperative systems, managing uncertainty to achieve successful human-robot partnerships that perform complex tasks in uncertain environments. It will build manufacturing and medical testbeds (for minimally invasive surgery, during which slight variations such as tremors, twitches or breaths can affect conditions) on which cooperative skills will be applied and tested.
A Novel Light-weight Cable-driven Active Leg Exoskeleton (C-ALEX) for Training of Human Gait
University of Delaware (Sunil K. Agrawal)
This research intends to improve gait-training rehabilitation for stroke patients, helping them learn how to walk again. Unlike "clamp robots" that affix to body parts to bolster a patient's muscle strength, this robotic machine is a lightweight, puppet-like cable-driven system that reduces the muscle burden. The person engaged in rehabilitation does not feel anything unlike his or her own limbs (no artificial encumbrance, nothing bulky or uncomfortable). The approach seeks to overcome the tendency of patients who train with weights to forget what they learned once the weight is removed--the weight acts as a literal and figurative crutch. This could mean hope for the roughly 700,000 Americans who experience a stroke each year and the 4.5 million people in the United States living with the after-effects of a stroke.
###
Source: http://www.eurekalert.org/pub_releases/2012-09/nsf-nm091712.php