Charalambos (Charis) Poullis

A novel AI-driven approach to redirected walking in virtual reality that eliminates the need for eye-tracking hardware.

Our patent "Methods and Systems for Real-Time Saccade Prediction" has been granted. TL;DR: A machine learning system that predicts natural saccadic eye movements in real-time, enabling redirected walking in virtual environments by leveraging inattentional blindness—without requiring expensive eye-tracking equipment or artificially triggering major saccades in users.
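The patented system itself is not reproduced here, but the idea of predicting saccades from cheap, always-available signals can be illustrated with a toy sketch. The features (head-yaw velocity and acceleration), the weights, and the threshold below are all hypothetical choices for illustration — not the patented model, which uses machine learning over richer inputs.

```python
import numpy as np

def saccade_features(head_yaw_deg, dt=1 / 90):
    """Per-frame |angular velocity| and |acceleration| of head yaw,
    sampled at 90 Hz (a typical VR headset refresh rate)."""
    vel = np.gradient(head_yaw_deg, dt)
    acc = np.gradient(vel, dt)
    return np.stack([np.abs(vel), np.abs(acc)], axis=1)

def predict_saccade(features, w=np.array([0.02, 0.001]), b=-2.0):
    """Sigmoid over a linear score: a stand-in for a learned classifier
    that outputs the per-frame probability of an imminent saccade."""
    score = features @ w + b
    return 1.0 / (1.0 + np.exp(-score))
```

The point of the sketch: fast head rotation (which in practice correlates with gaze shifts) drives the predicted saccade probability up, and frames flagged as likely saccades are when a redirected-walking controller could inject imperceptible scene rotation.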
Fast Self-Supervised Depth and Mask Aware Association for Multi-Object Tracking

The paper “Fast Self-Supervised Depth and Mask Aware Association for Multi-Object Tracking” by Milad Khanchi, Maria Amer, and Charalambos Poullis has been accepted for publication at the British Machine Vision Conference (BMVC) 2025. TL;DR: SelfTrEncMOT is a novel multi-object tracking framework that integrates zero-shot monocular depth estimation and promptable segmentation
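The paper's exact association formulation is not reproduced in this blurb, but the general idea of a depth-aware association step can be sketched as follows. This is an illustrative stand-in, not SelfTrEncMOT: the cost here is a hypothetical blend of box-IoU distance and per-object depth difference, matched with the Hungarian algorithm.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """Intersection-over-union of two boxes given as [x1, y1, x2, y2]."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def associate(tracks, dets, track_depths, det_depths, alpha=0.5):
    """Toy track-detection association: cost blends appearance overlap
    (1 - IoU) with the absolute difference of estimated depths, then
    solves the assignment with the Hungarian algorithm."""
    cost = np.zeros((len(tracks), len(dets)))
    for i, (tb, td) in enumerate(zip(tracks, track_depths)):
        for j, (db, dd) in enumerate(zip(dets, det_depths)):
            cost[i, j] = alpha * (1 - iou(tb, db)) + (1 - alpha) * abs(td - dd)
    rows, cols = linear_sum_assignment(cost)
    return [(int(r), int(c)) for r, c in zip(rows, cols)]
```

The depth term is what disambiguates overlapping boxes at different distances from the camera — the motivation for bringing zero-shot monocular depth into the association step.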
Pix2Poly: A Sequence Prediction Method for End-to-end Polygonal Building Footprint Extraction from Remote Sensing Imagery

The paper “Pix2Poly: A Sequence Prediction Method for End-to-end Polygonal Building Footprint Extraction from Remote Sensing Imagery” by Yeshwanth Kumar Adimoolam, Charalambos Poullis, and Melinos Averkiou has been accepted for publication at IEEE/CVF WACV 2025. TL;DR: This work introduces Pix2Poly, an attention-based, end-to-end trainable, and differentiable
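To make the "sequence prediction" framing concrete: a model in this family emits a flat sequence of discrete coordinate tokens that is then decoded into polygon vertices. The decoder below is a minimal sketch under assumed conventions (row-major quantization on a square grid, end-of-sequence token equal to `grid * grid`) — not Pix2Poly's actual vocabulary or decoding scheme.

```python
def tokens_to_polygon(tokens, grid=224):
    """Decode a flat sequence of quantized coordinate tokens into
    (x, y) polygon vertices. Token t encodes x = t % grid, y = t // grid;
    grid * grid is the (hypothetical) end-of-sequence token."""
    eos = grid * grid
    verts = []
    for tok in tokens:
        if tok == eos:
            break
        verts.append((tok % grid, tok // grid))
    return verts
```

Predicting footprints as token sequences is what lets the whole pipeline, from image to polygon vertices, be trained end to end instead of post-processing a raster mask into a polygon.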
Predicting Human Performance in Vertical Hierarchical Menu Selection in Immersive AR Using Hand-gesture and Head-gaze, HSI 2022

The paper titled “Predicting Human Performance in Vertical Hierarchical Menu Selection in Immersive AR Using Hand-gesture and Head-gaze” authored by Majid Pourmemar, Yashas Joshi and Charalambos Poullis will be presented at the 15th Conference on Human System Interaction, 2022. Abstract: There are currently limited guidelines on designing user interfaces (UIs)