Window View Quality Index Experimental Protocol and Global Dataset
Stefano Schiavon, Professor
Architecture
Applications for Spring 2024 are closed for this project.
Findings from the Center for the Built Environment demonstrate that having a “good” window view can enhance occupants' well-being and alleviate discomfort. While several metrics are used to evaluate different aspects of indoor environmental quality, there is no reliable index to quantify window view. Developing such an index will require extensive human subject experiments conducted across different scenarios to capture occupants' responses. Consequently, a novel approach is essential to comprehensively grasp both the physical attributes comprising a good view and the view quality as perceived by occupants.
Hence, the primary objective of this project is twofold: (1) to establish and validate an experimental protocol in collaboration with our partners, ensuring consistency in experimental design, conduct, and data compilation, and (2) to construct a large-scale View Quality Index (VQI) Global Dataset. This dataset will describe occupant responses to views of varying quality levels, encompassing diverse combinations of view content, access, and clarity—the three fundamentals comprising view quality. Our ultimate goal is a metric that is applicable across a wide array of buildings and occupancies.
Our team will start by carrying out the experiments in physical spaces (real buildings). In each session, participants will be randomly assigned to one of five spaces with different views and guided to sit in a chair facing the windows. Participants will then be surveyed about their satisfaction with the window view's content, access, clarity, privacy, and overall quality.
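As an illustrative sketch only, the assignment step above could be implemented along these lines. The space labels, survey item names, and the balanced round-robin scheme are placeholders assumed for the example, not the project's actual instruments or randomization plan.

```python
import random

# Hypothetical labels for the five experimental spaces.
SPACES = ["space_1", "space_2", "space_3", "space_4", "space_5"]

# Survey items from the protocol: satisfaction with view content,
# access, clarity, privacy, and overall view quality.
SURVEY_ITEMS = ["content", "access", "clarity", "privacy", "overall"]

def assign_participants(participant_ids, seed=42):
    """Randomly assign participants to the five spaces, keeping
    group sizes as balanced as possible (shuffle + round-robin)."""
    rng = random.Random(seed)  # fixed seed for a reproducible example
    ids = list(participant_ids)
    rng.shuffle(ids)
    assignment = {space: [] for space in SPACES}
    for i, pid in enumerate(ids):
        assignment[SPACES[i % len(SPACES)]].append(pid)
    return assignment
```

A shuffled round-robin keeps group sizes within one of each other, which a plain independent random draw would not guarantee for small samples.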
As a subsidiary component of this experiment, we also aim to validate whether respondent satisfaction differs significantly when the same window view is delivered through VR or 2D images rather than experienced in a physical space. This approach is necessitated by the limited ability to modify test conditions within real buildings, which makes acquiring an adequate number of data samples challenging. This sub-study will assess whether the delivery method significantly affects the results. In the absence of substantial differences, we intend to use both VR and image-based methods to evaluate broader contexts and further expand the dataset.
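One simple way to frame the delivery-method comparison is as a within-subject paired difference: each participant rates the same view in a physical space and via VR (or a 2D image), and we examine whether the mean difference is distinguishable from zero. The sketch below is a minimal illustration under a normal approximation, not the project's planned statistical analysis.

```python
from statistics import mean, stdev
from math import sqrt

def paired_delivery_effect(physical, vr):
    """Mean within-subject difference (physical - VR) in satisfaction
    ratings, with a rough 95% confidence interval using a normal
    approximation. A CI that includes zero is consistent with no
    meaningful delivery-method effect (illustrative only)."""
    diffs = [p - v for p, v in zip(physical, vr)]
    m = mean(diffs)
    se = stdev(diffs) / sqrt(len(diffs))  # standard error of the mean
    return m, (m - 1.96 * se, m + 1.96 * se)

# Example with made-up 7-point satisfaction ratings from 5 participants:
effect, ci = paired_delivery_effect([5, 6, 4, 7, 5], [5, 5, 4, 6, 5])
```

In practice, ordinal survey responses would more likely call for a non-parametric paired test (e.g., Wilcoxon signed-rank) and a larger sample; this fragment only illustrates the paired structure of the comparison.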
This topic explores the convergence of architecture/design, environmental psychology, VR modeling, and data science. Our group will provide students with academic insights into conducting top-notch research. We are seeking SIX enthusiastic students passionate about these subjects and eager to contribute to a diverse range of tasks. Participants will acquire hands-on experience in the research project.
Day-to-day supervisor for this project: Sun Woo Chang, PhD student
External collaborators for this project:
Won Hee Ko, Assistant Professor at NJIT
Timur Dogan, Associate Professor at Cornell University
Michael Kent, SinBerBEST
Role: I am looking for six students excited about these topics and eager to assist with a variety of tasks. Students will gain hands-on experience in the research project, including but not limited to cleaning and analyzing the experimental data and visualizing the results (graphs, diagrams, illustrations).
Qualifications: Enthusiasm for architecture/design, environmental psychology, VR modeling, and data science, and an eagerness to contribute to a diverse range of tasks.
Hours: 6-8 hrs
Related website: https://www.linkedin.com/in/stefanoschiavon/
Related website: https://www.linkedin.com/pulse/framework-assess-window-view-quality-stefano-schiavon/