The objects that surround us — desks, cars, shoes, and coats — are deaf and blind; this limits their ability to adapt to our needs and thus to be useful. We have therefore developed computer systems that can follow people’s actions, recognizing their faces, hand gestures, and facial expressions, and learning their preferences and idiosyncratic styles of expression. Using this technology, we have begun to make “smart rooms” and “smart clothes” that can help people in day-to-day life without chaining them to keyboards, microphones, pointing devices, or special goggles.