New: The recording is now available on YouTube.

The workshop "Learning Dexterous Manipulation" investigates learning-based approaches to dexterous manipulation with a high level of generalizability. Dexterous manipulation remains one of the most challenging problems in robotics, and this workshop offers insights and perspectives on the topic to researchers and participants. The latest advancements in various sensing technologies will also be discussed. The ultimate goal of the workshop is to equip participants with the knowledge and skills necessary to design and develop advanced robotic systems that can perform complex, perception-driven manipulation tasks and enhance human-robot interaction and collaboration.

This workshop is intended for researchers, engineers, and students with a solid background in learning-based approaches, computer vision, or related fields who are interested in robotics and robot sensing. The presenters and panelists include experts from both academia and industry, representing disciplines such as robot learning, robotics, mechanical engineering, and robot sensing. Accepted papers will be presented during the poster session, and selected papers will be featured in contributed talks. The workshop will be promoted through the mailing lists of universities and research institutes, as well as social media platforms. The topics listed below cover recent advancements and open questions in learning dexterous manipulation. We hope to connect researchers from the dexterous robotics, representation learning, and computer vision communities, foster collaboration in this exciting new domain, and provide a platform to discuss recent developments, challenges, and trade-offs.

Speakers and panelists


July 14th 2023

Times are given in Korea Standard Time (UTC+09:00)

  • 1:30pm - 1:45pm: Welcome and Online login
  • 1:45pm - 2:15pm: Invited Talk 1: Pulkit Agrawal (remote)
  • 2:15pm - 2:45pm: Invited Talk 2: Vikash Kumar (remote)
  • 2:45pm - 3:15pm: Spotlight presentation
  • 3:15pm - 3:45pm: Coffee break and poster session
  • 3:45pm - 4:15pm: Invited Talk 3: Tess Hellebrekers
  • 4:15pm - 4:45pm: Invited Talk 4: Abhishek Gupta
  • 4:45pm - 5:15pm: Invited Student Speakers
  • 5:15pm - 5:30pm: Closing remarks


Learning a Universal Human Prior for Dexterous Manipulation from Human Preference [PDF]

UniDexGrasp++: Improving Dexterous Grasping Policy Learning via Geometry-aware Curriculum and Iterative Generalist-Specialist Learning [PDF][Material]

Online augmentation of learned grasp sequence policies for more adaptable in-hand manipulation [PDF]

Teach a Robot to FISH: Versatile Imitation from One Minute of Demonstrations [PDF]

DexGraspNet: A Large-Scale Robotic Dexterous Grasp Dataset for General Objects Based on Simulation [PDF]

Rotating without Seeing: Towards In-hand Dexterity through Touch [PDF]

Dexterity from Touch: Self-Supervised Pre-Training of Tactile Representations with Robotic Play [PDF]

DexArt: Benchmarking Generalizable Dexterous Manipulation with Articulated Objects [PDF]

A Robust and Accurate System for Data Acquisition of Dexterous Manipulation [PDF]

Tactile Pose Feedback for Closed-loop Manipulation Tasks [PDF]

LEAP Hand: Low-Cost, Efficient, and Anthropomorphic Hand for Robot Learning [PDF]

DEFT: Dexterous Fine-Tuning for Real World, General Purpose Manipulation [PDF]

On the Utility of Koopman Operator Theory in Learning Dexterous Manipulation Skills [PDF]

The Power of the Senses: Generalizable Manipulation from Vision and Touch through Masked Multimodal Learning [PDF]

SpawnNet: Learning Generalizable Visuomotor Skills from Pre-trained Networks [PDF]

Dynamic Handover: Throw and Catch with Bimanual Hands [PDF]

Call for papers

Important dates (all times AoE)

  • Submissions open: April 19 2023
  • Submission deadline: June 16 2023 (extended from June 2 2023)
  • Decision notification: June 26 2023 (extended from June 16 2023)
  • Camera ready deadline: July 7 2023
  • Workshop: July 14th 2023


Submission link: CMT

In this workshop, we aim to bring together researchers working at the intersection of machine learning and robotics. We invite submissions in the following or related areas (non-exhaustive list):

  • Data for Dexterous Manipulation:
    • Can human hand data for dexterous manipulation be collected in a general way using any expert-grade equipment? What is the data gap between human and robot hands?
    • How can we improve current data collection methods, such as teleoperation, to facilitate large-scale data collection?
  • Computer Vision:
    • How can occlusion between objects and robot hands during dexterous manipulation be addressed?
    • How can policies generalize to the open world outside the lab environment, considering the relatively unpredictable changes in outdoor lighting and the vast amount of information that needs to be processed?
  • Tactile Information:
    • How can tactile information help robots better accomplish tasks and perceive their environment?
    • What kind of tactile information is best suited for dexterous robot hands, and can it compensate for the shortcomings of visual perception?
  • Robot Learning:
    • Will we see a unified and generalized model for most daily dexterous manipulation tasks or a specialized model for each individual task?
    • How can learning-based policies handle dynamic tasks that require high-frequency control and detailed dynamics models?
  • Any other related topics we might have forgotten in the list above 😄

Accepted Talks and Posters

Accepted papers will be presented in the form of posters (with lightning talks) or spotlight talks at the workshop. We encourage submissions of work in progress, as well as work that is not yet published.

Submission instructions

  • Please use the RSS 2023 template when preparing your paper. Supplementary material is optional and only needed if you wish to provide additional details or video demonstrations.
  • Submissions should be short papers of up to 4 pages in PDF format (not counting references and an optional appendix, which may exceed the limit).
  • All submitted materials will undergo a double-blind review process. The workshop will not produce formal proceedings, but accepted papers will be made available on the workshop website. Since this does not count as an archival publication, authors remain free to publish their work in archival journals or conferences.



For questions and comments, please contact us.