Robots Can Feel: LLM-based Framework for Robot Ethical Reasoning Paper • 2405.05824 • Published May 9, 2024
Co-driver: VLM-based Autonomous Driving Assistant with Human-like Behavior and Understanding for Complex Road Scenes Paper • 2405.05885 • Published May 9, 2024
Bi-VLA: Vision-Language-Action Model-Based System for Bimanual Robotic Dexterous Manipulations Paper • 2405.06039 • Published May 9, 2024
VR-GPT: Visual Language Model for Intelligent Virtual Reality Applications Paper • 2405.11537 • Published May 19, 2024
DogSurf: Quadruped Robot Capable of GRU-based Surface Recognition for Blind Person Navigation Paper • 2402.03156 • Published Feb 5, 2024
CognitiveOS: Large Multimodal Model based System to Endow Any Type of Robot with Generative AI Paper • 2401.16205 • Published Jan 29, 2024
LLM-BRAIn: AI-driven Fast Generation of Robot Behaviour Tree based on Large Language Model Paper • 2305.19352 • Published May 30, 2023
DeltaFinger: a 3-DoF Wearable Haptic Display Enabling High-Fidelity Force Vector Presentation at a User Finger Paper • 2211.00752 • Published Nov 1, 2022