Sentient Objects: Towards Middleware
for Mobile Context-Aware Applications
by Gregory Biegel and Vinny Cahill
Despite rapid advances in sensor technologies making a wealth of environmental data available to applications, programmer support for the development of context-aware applications remains poor. The majority of such applications are developed in an ad hoc, application-specific manner and there is an urgent need for middleware and services to support the development of such applications. Mobile, intelligent software components known as sentient objects provide one middleware abstraction that can ease the development of mobile context-aware applications.
The recent proliferation of cheap, small, and increasingly accurate sensor technologies is creating a new information revolution in which applications that interact with the physical environment are becoming widespread. This awareness of the physical environment, and its use by applications in fulfilment of their goals, is known as context-awareness, and a number of promising applications have appeared which make use of context information. Context-awareness is of particular importance in mobile environments, where the operating environment is constantly changing due to the mobility of devices and the characteristics of wireless communication technologies. In such environments, context-awareness can enable applications to respond intelligently to variable bandwidth, unreliable connections, and the varying cost of different connections. One of the greatest challenges in context-aware computing, and one that has not yet been adequately addressed, is the provision of middleware and services to support the application developer. The major problem lies in providing generic support for the acquisition and use of multiple fragments of information gleaned from (potentially unreliable) multi-modal sensors in a mobile environment.
The use of context information by applications in a mobile environment poses a number of challenges arising from the distributed and dynamic nature of sensors, the accuracy and resolution of sensors, and the fusion of output of multiple sensors in order to determine context. In addition, the mobile environment poses further challenges with regard to the dependability, predictability, and timeliness of communication. Middleware is required that provides abstractions for the fusion of sensor information to determine context, representation of context, and intelligent inference. Essential services that provide support for operation in a mobile environment, such as supporting the reliability of communication, are also required.
The sentient object model defines software abstractions that ease the use of sensor information, and associated actuation, by context-aware applications. At the heart of the model is an event-based communication model that permits loose coupling between objects and consequently supports mobility and application evolution. The event-based communication model includes mechanisms for the specification of constraints on the propagation and delivery of events. A sensor is a software abstraction that encapsulates a hardware sensor device and produces software events in response to real-world stimuli. An actuator consumes software events and encapsulates a physical device capable of real-world actuation. A sentient object is an entity that can both consume and produce software events, and lies in some control path between at least one sensor and one actuator.
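The relationships between these abstractions can be sketched in a few lines of Python. This is purely illustrative, not the CORTEX API: the class names (EventBus, Sensor, Actuator, SentientObject), the publish/subscribe methods, and the threshold rule are all our assumptions.

```python
# Illustrative sketch of the sentient object abstractions: sensors
# produce software events, actuators consume them, and a sentient
# object sits in the control path between the two.
from collections import defaultdict

class EventBus:
    """Loosely couples event producers and consumers."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self._subscribers[event_type]:
            handler(payload)

class Sensor:
    """Encapsulates a hardware device; produces events from stimuli."""
    def __init__(self, bus, event_type):
        self.bus, self.event_type = bus, event_type

    def stimulus(self, value):  # called when the hardware reports data
        self.bus.publish(self.event_type, value)

class Actuator:
    """Consumes events; here it simply records the commands it receives."""
    def __init__(self, bus, event_type):
        self.actions = []
        bus.subscribe(event_type, self.actions.append)

class SentientObject:
    """Consumes sensor events and produces actuation events."""
    def __init__(self, bus, in_type, out_type):
        self.bus, self.out_type = bus, out_type
        bus.subscribe(in_type, self.on_event)

    def on_event(self, value):
        # trivial control rule: actuate when a sensed value crosses a threshold
        if value > 30:
            self.bus.publish(self.out_type, "cooling-on")

bus = EventBus()
temp = Sensor(bus, "temperature")
fan = Actuator(bus, "fan-command")
controller = SentientObject(bus, "temperature", "fan-command")
temp.stimulus(25)   # below threshold: no actuation
temp.stimulus(35)   # above threshold: actuator receives a command
print(fan.actions)  # -> ['cooling-on']
```

Note how neither the sensor nor the actuator holds a reference to the sentient object; the loose coupling via events is what supports mobility and application evolution.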
Internally, a sentient object consists of three major functional components:
- the sensory capture component is responsible for fusing the outputs of multiple sensors, and uses probabilistic models, including Bayesian networks, to deal with inherent sensor uncertainties;
- the context representation component maintains a hierarchy of potential contexts in which an object can exist, and the current active context;
- the inference engine component is a production-rule-based inference engine and supporting knowledge base, giving objects the ability to intelligently control actuation based on their context.
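To make the first component concrete, the sketch below shows the kind of probabilistic fusion involved: a naive Bayesian update that combines readings from two unreliable sensors into a single estimate of a context ("occupied"). The sensor hit and false-alarm rates are invented for illustration; the CORTEX sensory capture component uses Bayesian networks, of which this is only the simplest special case.

```python
# Naive Bayesian fusion of two unreliable binary sensor readings
# to estimate the probability that the context "occupied" holds.
def fuse(prior, readings):
    """readings: list of (P(reading | occupied), P(reading | empty))."""
    p_occ, p_emp = prior, 1.0 - prior
    for p_given_occ, p_given_emp in readings:
        p_occ *= p_given_occ
        p_emp *= p_given_emp
    return p_occ / (p_occ + p_emp)  # normalise

# A motion sensor fired (90% hit rate, 10% false-alarm rate) and a
# microphone detected sound (70% hit rate, 30% false-alarm rate).
posterior = fuse(prior=0.5, readings=[(0.9, 0.1), (0.7, 0.3)])
print(round(posterior, 3))  # -> 0.955
```

Neither sensor alone is conclusive, but fusing the two raises the confidence in the context well above either individual reading, which is exactly the point of combining multi-modal, potentially unreliable sensors.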
Sentient objects are cooperative and in addition to traditional forms of communication, the sentient object model uses stigmergic coordination, or coordination via the environment. Stigmergic coordination does not rely on direct communication between objects, say via TCP/IP, but rather depends on objects being able to sense and make changes to their physical environment. Simple rules, embedded in the inference engine, then govern actuation and consequently behaviour, according to what has been sensed from the environment. This type of coordination is extremely robust and is particularly suited to mobile environments where traditional communication channels may not always be available or economical to use.
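The classic illustration of stigmergy is ant foraging, and a toy version of it (our example, not from the article) captures the idea: agents choosing between two routes never exchange messages; each deposits pheromone on the route it used, shorter routes accumulate pheromone faster, and the colony converges on the shorter route purely through the environment. The deposit and evaporation constants below are arbitrary.

```python
# Toy stigmergic coordination: ants pick routes in proportion to the
# pheromone on each route, and deposit pheromone inversely proportional
# to route length. No ant ever communicates with another directly.
pheromone = {"short": 1.0, "long": 1.0}   # route lengths 1 vs 2

for step in range(30):
    total = pheromone["short"] + pheromone["long"]
    share_short = pheromone["short"] / total      # fraction of ants per route
    # evaporation (factor 0.9) plus deposit scaled by 1/length
    pheromone["short"] = 0.9 * pheromone["short"] + share_short * 1.0
    pheromone["long"] = 0.9 * pheromone["long"] + (1 - share_short) * 0.5

print(pheromone["short"] > pheromone["long"])  # -> True
```

The robustness mentioned above falls out of this structure: if some agents fail or a route disappears, the remaining agents simply keep sensing and marking the environment, and the collective behaviour adapts without any reconfiguration protocol.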
We are developing middleware and associated services, based on the sentient object model, to support context-aware applications. Such middleware and services will ease the task of developing applications based on context perception in a number of key areas. Applications will be insulated from both the complexities of physical sensors, actuators, and associated protocols, as well as the fusion of sensor data to infer context and reduce uncertainty. The middleware will also provide a high-level rule specification language through which intelligence may be added to objects without requiring knowledge of the intricacies of a specific production rule system. Furthermore, an intuitive visual programming tool is under development that will permit the rapid development of context-aware applications.
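The article does not show the syntax of the rule specification language, so the sketch below is purely hypothetical: a minimal condition-action rule set, expressed in Python, of the general shape such a language abstracts over. The rule contents (speed, obstacle, light) and the `evaluate` helper are invented for illustration.

```python
# Hypothetical high-level rules mapping fused context to actuation,
# hiding the underlying production-rule system from the developer.
RULES = [
    # (condition over fused sensor facts, action to fire)
    (lambda facts: facts["speed"] > 50 and facts["obstacle"], "brake"),
    (lambda facts: facts["light"] < 0.2, "headlights_on"),
]

def evaluate(facts):
    """Return the actions whose conditions hold in the current context."""
    return [action for condition, action in RULES if condition(facts)]

print(evaluate({"speed": 60, "obstacle": True, "light": 0.1}))
# -> ['brake', 'headlights_on']
```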
The work described in this article was partly supported by the Future and Emerging Technologies programme of the Commission of the European Union under research contract IST-2000-26031 (CORTEX - CO-operating Real-time senTient objects: architecture and EXperimental evaluation).
Ubiquitous computing at Trinity College Dublin: http://www.dsg.cs.tcd.ie/index.php?category_id=228
Cortex Project: http://cortex.di.fc.ul.pt/
Gregory Biegel, Trinity College Dublin
Tel: +353 1 6081531
Vinny Cahill, Trinity College Dublin
Tel: +353 1 6081795