In future ubiquitous computing environments, our daily lives will be influenced by many computer-supported services embedded throughout our surroundings. To make interaction with these services intuitive, heterogeneous interaction techniques such as gesture recognition, speech recognition, and tangible user interfaces will emerge. Moreover, many services will accept input from multiple kinds of devices rather than from a single fixed set. In such multi-modal environments, application programmers must consider how to adapt heterogeneous input events to multi-modal services. We propose an input widget framework for distributed multi-modal applications that provides a high-level abstraction for heterogeneous input devices, which we call meta-inputs. Our framework provides generic, standard interfaces between input devices and services, enabling developers to deploy input devices and services independently. The framework also supports context-aware runtime adaptation, dynamically switching which input devices drive a given service.
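The decoupling described above can be sketched in miniature. This is an illustrative sketch only, not the paper's actual API: the names `MetaInput`, `SelectEvent`, and `Broker` are hypothetical, and it shows just the core idea that a service consumes device-independent events while the concrete device behind them can be swapped at runtime.

```python
from dataclasses import dataclass

@dataclass
class SelectEvent:
    """Device-independent 'select' event that any meta-input can emit."""
    target: str

class MetaInput:
    """High-level abstraction over a concrete input device (hypothetical)."""
    def __init__(self, name: str):
        self.name = name
    def recognize(self, raw: dict) -> SelectEvent:
        raise NotImplementedError

class GestureInput(MetaInput):
    def recognize(self, raw: dict) -> SelectEvent:
        # e.g. a pointing gesture toward a named target
        return SelectEvent(target=raw["pointed_at"])

class VoiceInput(MetaInput):
    def recognize(self, raw: dict) -> SelectEvent:
        # e.g. a spoken command such as "select lamp"
        return SelectEvent(target=raw["utterance"].split()[-1])

class Service:
    """A service handles generic events, never device-specific data."""
    def __init__(self):
        self.selected = None
    def handle(self, event: SelectEvent) -> None:
        self.selected = event.target

class Broker:
    """Binds one meta-input to a service; rebinding models the
    context-aware runtime switch between input devices."""
    def __init__(self, service: Service):
        self.service = service
        self.device = None
    def bind(self, device: MetaInput) -> None:
        self.device = device
    def deliver(self, raw: dict) -> None:
        self.service.handle(self.device.recognize(raw))

service = Service()
broker = Broker(service)
broker.bind(GestureInput("camera"))          # gesture device drives the service
broker.deliver({"pointed_at": "lamp"})
print(service.selected)                      # -> lamp
broker.bind(VoiceInput("microphone"))        # runtime adaptation: swap device
broker.deliver({"utterance": "select projector"})
print(service.selected)                      # -> projector
```

Because the service only ever sees `SelectEvent`, the gesture and voice devices can be developed and deployed independently of it, which is the property the framework claims for real meta-inputs.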