We live in a dynamic world full of motion. To interact effectively, whether as human, animal, or machine, one must distinguish and identify the intentional or meaningful movements of animate creatures while discarding inanimate motion such as the swaying of trees or falling leaves. Indeed, many behaviors throughout the animal kingdom consist of performing and recognizing highly specialized patterns of oscillatory motion. In this talk I will present a categorical approach to describing and recognizing oscillatory motions using a simple model with specific, limited parameter values. I will also describe a complexity ordering on the motions, derived from parameter specialization and observations of biological motion. This categorical organization serves as the foundation for a machine vision system designed to recognize these motion patterns. Results of the perceptual method employing these constraints will be shown on both synthetic and real video data. Lastly, I will describe an interactive system in which users perform particular oscillatory motions for reactive virtual hummingbirds that embody the recognition models. This research explores the categorical structure of oscillatory motion to provide a simple yet compelling computational strategy for identifying motion patterns with machine vision. Such categorical methods are useful for characterizing structurally related patterns without resorting to uninformed, ad hoc models.
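As a toy illustration of the kind of analysis underlying oscillatory-motion recognition (not the categorical model described in the talk; the function name and parameters here are purely hypothetical), one might extract the dominant oscillation frequency of a tracked motion signal via a Fourier transform:

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the dominant oscillation frequency (Hz) of a 1-D motion signal."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()              # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))       # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]            # frequency of the largest peak

# Example: a 2 Hz waving motion sampled at 30 frames per second
t = np.arange(0, 4, 1 / 30)
wave = np.sin(2 * np.pi * 2.0 * t)
print(dominant_frequency(wave, 30))  # → 2.0
```

A real recognizer would of course go further, matching the detected frequency, amplitude, and phase structure against the specialized parameter values that define each motion category.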