Publications

You can also find my articles on my Google Scholar profile.

Tactile Neural De-rendering

J.A. Eyzaguirre, Miquel Oller, Nima Fazeli

Under Review, 2024

We introduce Tactile Neural De-rendering, a novel approach that leverages a generative model to reconstruct a local 3D representation of an object based solely on its tactile signature.

Contrastive Touch-to-Touch Pretraining (CTTP)

Samanta Rodriguez, Yiming Dou, William van den Bogert, Miquel Oller, Kevin So, Andrew Owens, Nima Fazeli

Under Review, 2024

We present a contrastive self-supervised learning method to unify tactile feedback across different sensors, using paired tactile data. By treating paired signals as positives and unpaired ones as negatives, our approach learns a sensor-agnostic latent representation, capturing shared information without relying on reconstruction or task-specific supervision.
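The paired-positive setup described above can be sketched as a symmetric InfoNCE-style loss: embeddings of the same contact from two different sensors sit on the diagonal of a similarity matrix and are pulled together, while off-diagonal (unpaired) entries are pushed apart. This is a minimal NumPy illustration under assumed names and a made-up temperature, not the paper's implementation:

```python
import numpy as np

def infonce_loss(z_a, z_b, temperature=0.1):
    """Symmetric InfoNCE over a batch of paired tactile embeddings.

    z_a, z_b: (N, D) L2-normalized embeddings from two different sensors;
    row i of z_a and row i of z_b come from the same contact (a positive pair),
    all other rows serve as negatives.
    """
    logits = z_a @ z_b.T / temperature       # (N, N) cosine-similarity matrix
    idx = np.arange(len(z_a))                # positives lie on the diagonal

    def cross_entropy(l):
        l = l - l.max(axis=1, keepdims=True)             # numerical stability
        log_probs = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -log_probs[idx, idx].mean()               # diagonal = positives

    # average both directions: sensor A -> sensor B and B -> A
    return 0.5 * (cross_entropy(logits) + cross_entropy(logits.T))
```

Minimizing this loss requires no reconstruction target or task labels, which is what makes the learned latent space sensor-agnostic.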

Touch2Touch: Cross-Modal Tactile Generation for Object Manipulation

Samanta Rodriguez, Yiming Dou, Miquel Oller, Andrew Owens, Nima Fazeli

Under Review, 2024

The diversity of touch sensor designs complicates general-purpose tactile processing. We address this by training a diffusion model for cross-modal prediction, translating tactile signals between GelSlim and Soft Bubble sensors. This enables sensor-specific methods to be applied across sensor types.

Neural Inverse Source Problems

Youngsun Wi, Jayjun Lee, Miquel Oller, Nima Fazeli

8th Conference on Robot Learning (CoRL), 2024

We propose a Physics-Informed Neural Network (PINN) approach for solving inverse source problems in robotics, jointly identifying unknown source functions and system states from partial, noisy observations. Our method integrates diverse constraints, avoids complex discretizations, accommodates real measurement gradients, and is not limited by training data quality.

Tactile-Driven Non-Prehensile Object Manipulation via Extrinsic Contact Mode Control

Miquel Oller, Dmitry Berenson, Nima Fazeli

Robotics: Science and Systems (RSS), 2024

We consider the problem of non-prehensile manipulation with highly compliant, high-resolution tactile sensors. Our approach accounts for contact mechanics and sensor dynamics to achieve desired object poses and transmitted forces, and is amenable to gradient-based optimization.

See, feel, act: Hierarchical learning for complex manipulation skills with multisensory fusion

Nima Fazeli, Miquel Oller, Jiajun Wu, Zheng Wu, J. B. Tenenbaum, Alberto Rodriguez

Science Robotics, 2019

This work introduces a methodology for robots to learn complex manipulation skills, such as playing Jenga, by emulating hierarchical reasoning and multisensory fusion through a temporal hierarchical Bayesian model. By leveraging learned tactile and visual representations, the robot adapts its actions and strategies in a manner similar to human gameplay.