Nicholas Giudice

Office: 331 Boardman Hall

Lab: 236 Boardman Hall and Carnegie Hall

Nicholas Giudice is Professor of Spatial Computing in the School of Computing and Information Science and cooperating faculty in the Department of Psychology at the University of Maine. He is the founding Director of the Virtual Environments and Multimodal Interaction (VEMI) Laboratory and Director of the UMaine National Center for Geographic Information and Analysis (NCGIA).


  • Postdoctoral Research Fellow (2005-2008), Psychological and Brain Sciences Program, Psychology Department, University of California, Santa Barbara
  • Ph.D. (2004), Cognitive and Brain Sciences, University of Minnesota

Research interests

Nicholas Giudice is director of the VEMI Lab, which houses Maine's only research facility combining a fully immersive virtual reality (VR) installation with augmented reality (AR) technologies in an integrated research and development environment. His research program combines techniques from experimental psychology, cognitive neuroscience, and human-computer interaction, with an emphasis on studying multimodal spatial cognition (vision, touch, audition, and language). He has specific research interests in spatial learning and navigation with and without vision across the lifespan, and in specifying information requirements for the design and evaluation of spatial interfaces used in assistive technology, gerontechnology, and autonomous vehicle interaction.

Giudice has authored or co-authored over 150 publications, including journal articles, conference papers, book chapters, and edited volumes. Since 2008, Dr. Giudice has been the PI or Co-PI on 15 funded projects totaling over $9 million from sponsors including the National Science Foundation (NSF), the National Institutes of Health (NIH), the National Institute on Disability and Rehabilitation Research (NIDRR), and several SBIR/STTR industrial partnerships. The VEMI Lab also frequently contracts with commercial partners on virtual and augmented reality projects and on developing multimodal information-access technology.