Seventy years ago, computers knew only what could be loaded into their memory; today they can access an entire internet of information. This expansion of knowledge has made them indispensable tools and assistants. Even today, however, computing devices know little about the physical world, especially the environment immediately around them, because they lack perceptual capabilities. As a result, they can tell you more about medieval literature or traffic in Tokyo than about the home in which they reside. This lack of perception limits how smart and useful they can be, especially in everyday tasks that could be augmented with information and interactivity.
In this talk, I will present my research on sensing approaches that enhance computers' perception of the immediate physical world. Specifically, I have explored sensing technologies that allow a single deployed sensor to cover a wide area for user activity and event recognition, as well as sensing technologies that enable the manufacture of smarter everyday objects. Together, these technologies allow computers to monitor the state, count, and intensity of activities, which in turn can enable higher-order applications in personal informatics, accessibility, digital health, sustainability, and beyond.
Speaker Biography
Yang Zhang is a PhD candidate in the Human-Computer Interaction Institute at Carnegie Mellon University and a Qualcomm Innovation Fellow. His research lies in the technical aspects of Human-Computer Interaction (HCI), with a focus on sensing technologies that enhance computing devices with knowledge of the physical world around them. His research has received two best paper awards and four honorable mention awards at top venues, as well as extensive coverage from leading media outlets such as MIT Technology Review, Engadget, and The Wall Street Journal. Because much of his research is highly applied, it has led to collaborations with industry partners such as Facebook Reality Labs, Apple, and Microsoft Research. More information can be found on his website: https://yangzhang.dev.