Hi! I am Tanmay Randhavane. I am a graduate student in Computer Science at the University of North Carolina at Chapel Hill. I am from Maharashtra, India. I completed my Bachelor's degree in Computer Science at the Indian Institute of Technology Bombay with honors and a minor in Statistics. Currently, I am working under the guidance of Prof. Dinesh Manocha on a project involving the social perception of pedestrians and virtual agents using non-verbal movements, with applications to human-robot interaction, virtual agent simulation, and affective computing.
Social Perception of Pedestrians | Ongoing
Researching mathematical models for the perception of emotions, dominance, friendliness, and approachability of pedestrians using non-verbal movement cues such as trajectories, gaits, gestures, and gaze.
Socially-Aware Robot Navigation | Ongoing
Researching novel algorithms for robot navigation around humans based on their personalities, emotions, dominance, and friendliness.
Virtual Agent Simulation | Ongoing
Developing novel algorithms for the simulation of virtual agents with different emotions, dominance, and friendliness levels.
Snap Inc., Venice, CA | May 2018 - August 2018
Researched algorithms to identify emotions from RGB videos using LSTM-based deep features and psychology-based affective features.
Developed data-driven methods to simulate a variety of emotions for virtual agents using gaits and gazing.
Amazon Development Center, Bangalore, India | May 2014 - July 2014
Worked in the Amazon Fulfillment Technologies (AFT) team to create a testing framework for a data platform service.
Developed a user-friendly framework for creating functional and integration tests, generating mock messages, and publishing them to the corresponding queues.
Virtual Digital Assistant for Augmented Reality | January 2019 - May 2019
Simulated a virtual lab assistant in AR with realistic appearance and friendliness characteristics.
Implemented varying levels of friendliness for virtual agents based on gaits, gestures, and gazing.
Emotionally Intelligent Robot | August 2018 - Ongoing
Implemented an emotionally-aware navigation algorithm on the Pepper robot, enabling it to navigate around multiple pedestrians.
The robot adjusted its navigation according to the pedestrians' emotions and dominance, which were determined from their trajectories and facial expressions.
DEBS Grand Challenge: Smart Grids | January 2014 - May 2014
Designed a framework to process over a million events per second from smart plugs to identify outliers and predict energy requirements.
Aman Mangal, Arun Mathew, Tanmay Randhavane, and Umesh Bellur. “Predicting power needs in smart grids.” In Proceedings of the 8th ACM International Conference on Distributed Event-Based Systems (DEBS), pp. 298-301. ACM, 2014.
Tanmay Randhavane, Aniket Bera, Emily Kubin, Austin Wang, Kurt Gray, and Dinesh Manocha. “Pedestrian Dominance Modeling for Socially-Aware Robot Navigation.” In 2019 IEEE International Conference on Robotics and Automation (ICRA), pp. 5528-5535. IEEE, 2019.
Tanmay Randhavane, Aniket Bera, Kyra Kapsaskis, Kurt Gray, and Dinesh Manocha. “FVA: Modeling Perceived Friendliness of Virtual Agents Using Movement Characteristics.” To appear in IEEE Transactions on Visualization and Computer Graphics (TVCG), Special Issue for the 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 2019.
Tanmay Randhavane, Aniket Bera, Kyra Kapsaskis, Rahul Sheth, Kurt Gray, and Dinesh Manocha. “EVA: Generating Emotional Behavior of Virtual Agents using Expressive Features of Gait and Gaze.” In ACM Symposium on Applied Perception (SAP 2019). ACM, 2019.
Tanmay Randhavane, Aniket Bera, and Dinesh Manocha. “F2FCrowds: Planning agent movements to enable face-to-face interactions.” Presence: Teleoperators and Virtual Environments 26, no. 2 (2017): 228-246.
Tanmay Randhavane, Aniket Bera, Emily Kubin, Kurt Gray, and Dinesh Manocha. “Modeling Data-Driven Dominance Traits for Virtual Characters using Gait Analysis.” Under review for IEEE Transactions on Visualization and Computer Graphics (TVCG).
Tanmay Randhavane, Aniket Bera, Kyra Kapsaskis, Uttaran Bhattacharya, Kurt Gray, and Dinesh Manocha. “Identifying Emotions from Walking using Affective and Deep Features.” Under review for IEEE Transactions on Affective Computing (IEEE TAFFC). Best Poster Award at the ACM Symposium on Applied Perception 2019 (SAP ’19).
Tanmay Randhavane, Uttaran Bhattacharya, Kyra Kapsaskis, Aniket Bera, Kurt Gray, and Dinesh Manocha. “The Liar's Walk: Detecting Deception with Gait and Gesture.” Under review for the ACM CHI Conference on Human Factors in Computing Systems (CHI 2020).
Tanmay Randhavane, Rohan Prinja, Kyra Kapsaskis, Austin Wang, Kurt Gray, and Dinesh Manocha. “The Emotionally Intelligent Robot: Improving Social Navigation in Crowded Environments.” Under review for the 2020 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2020.
Tanmay Randhavane, Emily Kubin, Husam Shaik, Kurt Gray, and Dinesh Manocha. “Data-driven modeling of group entitativity in virtual environments.” In Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology (VRST), p. 31. ACM, 2018.
Tanmay Randhavane, Emily Kubin, Austin Wang, Kurt Gray, and Dinesh Manocha. “The Socially Invisible Robot: Navigation in the Social World Using Robot Entitativity.” In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4468-4475. IEEE, 2018.
Tanmay Randhavane and Dinesh Manocha. “Aggressive, Tense or Shy? Identifying Personality Traits from Crowd Videos.” In Proceedings of the 26th International Joint Conference on Artificial Intelligence (IJCAI), pp. 112-118. 2017.
Aniket Bera, Sujeong Kim, Tanmay Randhavane, Srihari Pratapa, and Dinesh Manocha. “GLMP: Realtime pedestrian path prediction using global and local movement patterns.” In 2016 IEEE International Conference on Robotics and Automation (ICRA), pp. 5528-5535. IEEE, 2016.
Sahil Narang, Andrew Best, Tanmay Randhavane, Ari Shapiro, and Dinesh Manocha. “PedVR: Simulating gaze-based interactions between a real user and virtual crowds.” In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology (VRST), pp. 91-100. ACM, 2016.
SN 336, Sitterson Hall, Department of Computer Science, Chapel Hill, NC 27514.