Keita Higuchi is a researcher at Preferred Networks. He received his B.A. degree from Kanazawa Institute of Technology, and his M.S. and Ph.D. degrees from the University of Tokyo, where he was supervised by Prof. Jun Rekimoto at Rekimoto Lab. His work there on Human-UAV interaction won an Emerging Technologies Prize at ACM SIGGRAPH Asia 2012.
He twice joined the Multimedia, Interaction, and Communication (MIC) group at Microsoft Research Redmond as a research intern, working on the Viiboard Project, which led to a paper accepted at CHI 2015. He then worked with Prof. Yoichi Sato on the CREST project as a Project Research Associate/Lecturer at the Institute of Industrial Science. The project focuses on understanding group attention and activities by analyzing information gathered from multiple wearable devices, such as wearable cameras and eye trackers.
He is interested in remote collaboration and video browsing interfaces; his papers on these topics were published at CHI 2016, CHI 2017, and IUI 2019. He is also actively working on accessibility technologies for people with disabilities (e.g., people with visual impairments and children with ASD), with papers accepted at IUI 2018 and CHI 2019.
Keita Higuchi, Hiroki Tsuchida, Eshed Ohn-Bar, Yoichi Sato, Kris Kitani, Learning Context-Dependent Personal Preferences for Adaptive Recommendation, to appear in ACM Transactions on Interactive Intelligent Systems. [Accepted!]
Rie Kamikubo, Naoya Kato, Keita Higuchi, Ryo Yonetani, Yoichi Sato, Support Strategies for Remote Guides in Assisting People with Visual Impairments for Effective Indoor Navigation, to appear in CHI 2020, April 2020. [Accepted!][preprint]
Seita Kayukawa, Keita Higuchi, João Guerreiro, Shigeo Morishima, Yoichi Sato, Kris Kitani, Chieko Asakawa, BBeep: A Sonic Collision Avoidance System for Blind Travellers and Nearby Pedestrians, Proceedings of CHI 2019, May 2019. [Accepted!][project page]
Irshad Abibouraguimane, Kakeru Hagihara, Keita Higuchi, Yuta Itoh, Yoichi Sato, Tetsu Hayashida, and Maki Sugimoto, CoSummary: Adaptive Fast-Forwarding for Surgical Videos by Detecting Collaborative Scenes Using Hand Regions and Gaze Positions, Proceedings of IUI 2019, March 2019. [Accepted!]
Keita Higuchi, Soichiro Matsuda, Rie Kamikubo, Takuya Enomoto, Yusuke Sugano, Jun'ichi Yamamoto, and Yoichi Sato, Visualizing Gaze Direction to Support Video Coding of Social Attention for Children with Autism Spectrum Disorder, Proceedings of IUI 2018. [preprint]
Keita Higuchi, Ryo Yonetani, and Yoichi Sato, EgoScanning: Quickly Scanning First-Person Videos with Egocentric Elastic Timelines, Proceedings of CHI 2017. [preprint]
Keita Higuchi, Ryo Yonetani, and Yoichi Sato, Can Eye Help You?: Effects of Visualizing Eye Fixations on Remote Collaboration Scenarios for Physical Tasks, Proceedings of CHI 2016. [preprint]
Keita Higuchi, Yinpeng Chen, Philip A Chou, Zhengyou Zhang, Zicheng Liu, ImmerseBoard: Immersive Telepresence Experience using a Digital Whiteboard, Proceedings of CHI 2015, Seoul, Korea, 2015. [pdf]
Kei Nitta, Keita Higuchi, Yuichi Tadokoro, Jun Rekimoto, Shepherd Pass: Ability Tuning for Augmented Sports using Ball-Shaped Quadcopter, Proceedings of ACE 2015, Iskandar, Malaysia, 2015. [preprint]
Keita Higuchi, Katsuya Fujii, Jun Rekimoto, Flying Head: A Head-Synchronization Mechanism for Flying Telepresence, Proceedings of the 23rd IEEE International Conference on Artificial Reality and Telexistence (ICAT 2013), December 11-13, 2013, Tokyo, Japan. [preprint]