{"id":5070,"date":"2025-06-25T14:58:58","date_gmt":"2025-06-25T18:58:58","guid":{"rendered":"https:\/\/umaine.edu\/vemi\/?page_id=5070"},"modified":"2025-06-25T14:58:59","modified_gmt":"2025-06-25T18:58:59","slug":"research-statement-contributions","status":"publish","type":"page","link":"https:\/\/umaine.edu\/vemi\/nick\/dr-nicholas-giudice\/about-me-nick__trashed\/research-statement-contributions\/","title":{"rendered":"Research Contributions and Statement"},"content":{"rendered":"\n<p><strong>Research Statement<\/strong><\/p>\n\n\n\n<p>My research program is inherently interdisciplinary, combining principles from human perception,&nbsp;cognitive neuroscience, and human-computer interaction. My mission (and that of the VEMI Lab) is to&nbsp;envision, develop, and evaluate human-inspired nonvisual, enhanced visual, and multimodal&nbsp;information access technologies for improving environmental awareness, spatial learning, and&nbsp;navigation. Our solutions make a difference in people\u2019s lives by addressing the immediate&nbsp;information access needs of blind\/visually impaired people (representing 12 million persons in the U.S.&nbsp;and 285 million worldwide), as well as older adults experiencing vision loss (most visual impairment is&nbsp;age-related, and the reality is that 70-year-old eyes are not as keen as 20-year-old eyes). Visual&nbsp;impairment need not be physical or permanent; sighted people are also frequently \u201cblind\u201d to their&nbsp;environment. 
My research program addresses these scenarios through solutions for what we call&nbsp;situational blindness (e.g., texting while walking), eyes-free applications (such as performing a secondary&nbsp;task while driving), and situations where accurate imagination requires more than visual information (such as&nbsp;understanding the sight\/sound characteristics of a new windfarm installation).&nbsp;<\/p>\n\n\n\n<p>My basic research program has influenced experimental psychologists and cognitive&nbsp;neuroscientists in the domain of blind spatial cognition, navigation, and multimodal information&nbsp;processing, and has been applied by computer scientists and engineers to the development of multimodal&nbsp;information access technology and sensory substitution devices. My experiences as a congenitally blind&nbsp;person provide me with unrivaled first-hand knowledge about the needs and challenges of this&nbsp;demographic and key insight into what does and doesn\u2019t work in the design of nonvisual information&nbsp;access technology, something that is often misunderstood by researchers\/designers without this&nbsp;phenomenology.<\/p>\n\n\n\n<p><strong>Primary Contributions to Science<\/strong><\/p>\n\n\n\n<p>Below are several programmatic areas of particular interest where I believe my work has made the&nbsp;greatest scientific contributions and been the most impactful to both technology designers and end-users.<\/p>\n\n\n\n<p><a href=\"https:\/\/umaine.edu\/vemi\/facultystudents\/dr-nicholas-giudice\/about-me-nick\/wayfinding-with-words\/\"><em>I. Wayfinding with words:<\/em><\/a><\/p>\n\n\n\n<p>This line of research studies how spatial language, spatialized audio, and real-time verbal descriptions&nbsp;can be used to support nonvisual wayfinding and cognitive map development in large-scale real and&nbsp;virtual environments (with an emphasis on indoor spaces). 
It also addresses the use of verbal descriptions&nbsp;and other nonvisual information to provide access to local \u201cscenes\u201d for people who cannot see their&nbsp;surroundings, e.g., blind\/visually impaired (BVI) individuals or sighted people operating in the dark.<\/p>\n\n\n\n<p><a href=\"https:\/\/umaine.edu\/vemi\/facultystudents\/dr-nicholas-giudice\/about-me-nick\/multimodal-spatial-cognition-msc\/\"><em>II. Multimodal Spatial Cognition (MSC):<\/em><\/a><\/p>\n\n\n\n<p>Most spatial cognition research addresses only visual-spatial information and ignores the&nbsp;role of other spatial inputs. My research compares spatial learning, updating, and&nbsp;wayfinding behavior within and between modalities (3-D sound, touch, vision, and spatial&nbsp;language). I employ both behavioral and neuroimaging paradigms and incorporate both BVI&nbsp;and sighted people across a range of ages and abilities.<\/p>\n\n\n\n<p><a href=\"https:\/\/umaine.edu\/vemi\/facultystudents\/dr-nicholas-giudice\/about-me-nick\/blindness-visual-impairment\/\"><em>III. Blindness and visual impairment:<\/em><\/a><\/p>\n\n\n\n<p>Most of my interests relate in some way to nonvisual or multimodal spatial abilities and related&nbsp;technologies, but this line of work deals specifically with theories and technologies related to blind&nbsp;and visually impaired (BVI) people. The overarching theme is that the majority of challenges,&nbsp;differences, and problems cited in the literature regarding BVI spatial abilities are due to insufficient&nbsp;information access from nonvisual sensing or inadequate spatial problem-solving abilities, rather&nbsp;than vision loss per se.<\/p>\n\n\n\n<p><a href=\"https:\/\/umaine.edu\/vemi\/facultystudents\/dr-nicholas-giudice\/about-me-nick\/multimodal-information-access-technology\/\"><em>IV. 
Multimodal Information Access Technology:<\/em><\/a><\/p>\n\n\n\n<p>Much of my recent research has dealt with the design, development, and usability&nbsp;evaluation of multimodal information access technology (MIAT) to support spatial&nbsp;perception, environmental awareness, and wayfinding behavior without vision (solutions for&nbsp;blind people), with reduced vision (solutions for visually impaired or older people), or with&nbsp;distracted vision (solutions for sighted people operating in eyes-free situations or who are&nbsp;situationally blind to their environment, for instance, texting while walking).<\/p>\n\n\n\n<p><a href=\"https:\/\/umaine.edu\/vemi\/facultystudents\/dr-nicholas-giudice\/about-me-nick\/spatial-aging-navigation\/\"><em>V. Spatial Aging and Navigation:<\/em><\/a><\/p>\n\n\n\n<p>This research investigates how navigation and other spatial behaviors change across the&nbsp;lifespan. Results are used to develop new spatial gerontechnologies that&nbsp;mitigate the problems identified. This work is timely, as our population is rapidly aging and&nbsp;normal declines in spatial abilities can have detrimental effects on independence, wellbeing,&nbsp;and quality of life for older adults.<\/p>\n\n\n\n<p><a href=\"https:\/\/umaine.edu\/vemi\/facultystudents\/dr-nicholas-giudice\/about-me-nick\/multimodal-information-visualization-miv\/\"><em>VI. Multimodal Information Visualization (MIV):<\/em><\/a><\/p>\n\n\n\n<p>Humans often have trouble imagining complex data, scenes, or environments. This&nbsp;challenge is exacerbated by the use of traditional information visualization tools, which are&nbsp;static, 2D, and based purely on visual information. This line of research investigates the&nbsp;design of new spatial visualization techniques and the development of improved multimodal&nbsp;interfaces for commercial interests. 
Our MIV approach is based on cutting-edge virtual and&nbsp;augmented reality technologies and multimodal interfaces employing audio, touch, vision,&nbsp;or combinations thereof, to render information in an intuitive, meaningful, and accessible&nbsp;manner.<\/p>\n\n\n\n<p><em>Complete List of Published Work:<\/em><\/p>\n\n\n\n<p>E-pubs at:&nbsp;<a href=\"https:\/\/umaine.edu\/vemi\/publications\/\">https:\/\/umaine.edu\/vemi\/publications\/<\/a><\/p>\n\n\n\n<p>Google Scholar:&nbsp;<a href=\"https:\/\/scholar.google.com\/citations?user=jD95I7EAAAAJ\">https:\/\/scholar.google.com\/citations?user=jD95I7EAAAAJ<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Research Statement My research program is inherently interdisciplinary, combining principles from human perception,&nbsp;cognitive neuroscience, and human-computer interaction. My mission (and that of the VEMI Lab) is to&nbsp;envision, develop, and evaluate human-inspired nonvisual, enhanced visual, and Multimodal&nbsp;information access technologies for improving environmental awareness, spatial learning, and&nbsp;navigation. 
Our solutions make a difference in people\u2019s lives by providing [&hellip;]<\/p>\n","protected":false},"author":611,"featured_media":0,"parent":945,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_seopress_robots_primary_cat":"","_seopress_titles_title":"","_seopress_titles_desc":"","_seopress_robots_index":"","_kad_blocks_custom_css":"","_kad_blocks_head_custom_js":"","_kad_blocks_body_custom_js":"","_kad_blocks_footer_custom_js":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"class_list":["post-5070","page","type-page","status-publish","hentry"],"taxonomy_info":[],"featured_image_src_large":false,"author_info":{"display_name":"eblackwood","author_link":"https:\/\/umaine.edu\/vemi\/author\/eblackwood\/"},"comment_info":0,"_links":{"self":[{"href":"https:\/\/umaine.edu\/vemi\/wp-json\/wp\/v2\/pages\/5070","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/umaine.edu\/vemi\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/umaine.edu\/vemi\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/umaine.edu\/vemi\/wp-json\/wp\/v2\/users\/611"}],"replies":[{"embeddable":true,"href":"https:\/\/umaine.edu\/vemi\/wp-json\/wp\/v2\/comments?post=5070"}],"version-history":[{"count":1,"href":"https:\/\/umaine.edu\/vemi\/wp-json\/wp\/v2\/pages\/5070\/revisions"}],"predecessor-version":[{"id":5074,"href":"https:\/\/umaine.edu\/vemi\/wp-json\/wp\/v2\/pages\/5070\/revisions\/5074"}],"up":[{"embeddable":true,"href":"https:\/\/umaine.edu\/vemi\/wp-json\/wp\/v2\/pages\/945"}],"wp:attachment":[{"href":"https:\/\/umaine.edu\/vemi\/wp-json\/wp\/v2\/media?parent=5070"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}