Dealing with the need for Infrastructural Support in Ambient Intelligence
2 June 2009 @ School of Computing and Mathematics, University of Ulster, Jordanstown campus
Dr. Diego Lz-de-Ipiña Glz-de-Artaza
Faculty of Engineering (ESIDE), University of Deusto, Bilbao
dipina@eside.deusto.es
http://www.morelab.deusto.es
http://www.smartlab.deusto.es
http://paginaspersonales.deusto.es/dipina
Introduction
What are the endemic problems of AmI precluding its wider deployment? Probably many factors, but a very remarkable one is the... "unfortunate" high demand for infrastructural support!!!
- Sensors
- Actuators
- Automation buses and protocols
- Wireless communication links
- Middleware
- Context modelling and reasoning engines
- And so on and so forth...
Research Motivation
Given that AmI is not possible without infrastructure... how do we alleviate this "unfortunate" need?
Our approach/research aim: use and adapt low-cost off-the-shelf hardware infrastructure and combine it with intelligent middleware and interaction techniques to make "any" environment appear "intelligent".
This talk describes several iterative research efforts addressing the infrastructure dependency issue.
Talk Outline
Part 0: Bird's-eye view of my research group and laboratory activities (5')
Part 1: Review of my previous research work on solutions to address "the need for infrastructure in AmI" (35')
- Iteration 1: Build your own sensing and reasoning infrastructure
- Iteration 2: Concentrate on explicit user-environment interaction
- Iteration 3: Leverage Web 2.0 principles and map them to AmI
- Iteration 4: Deal with the heterogeneity and dynamic behaviour of existing instrumented environments
- Iteration 5: Focus on a more specific application domain: AAL
Part 2: Review of current research lines & projects (10')
MoreLab Research Group & SmartLab Research Laboratory @ University of Deusto
University of Deusto, Bilbao
A private Jesuit university founded in 1886 in Bilbao, Basque Country, Spain. It offers degrees in:
- Business & Administration
- Law
- Psychology
- Engineering
- Social Sciences & Linguistics
URL: http://www.deusto.es
Our Research Group: MoreLab
Created in 2003; 3 lecturers and 12 researchers. Specialized in Mobile-Mediated Interaction, Internet of Things, Smart Objects, Semantic Middleware and AAL.
URL: http://www.morelab.deusto.es
Remote Control of Embedded Sensing Devices
Mobile-mediated Human-Environment Interaction
Mobile-mediated Human-Environment Interaction
Mobile-mediated Human-Environment Interaction
RealWidget: desktop widgets for the real world
RFID Glove (left) and RealWidget (right)
Souvenir-aware Google Earth
Prototyping an intelligent chair (FlexChair) with an embedded wireless sensor node
AmI-enabling Semantic Middleware
Home control for AAL
Our Research Lab: SmartLab
A research laboratory focused on AmI research. Aim: create an intelligent working space with a double purpose:
- provide infrastructure to host and attract research projects related to AmI
- assess the suitability of AmI as a mechanism to enrich and improve the daily working activities of a group of users
URL: http://www.smartlab.deusto.es
Our Research Lab: SmartLab
Iteration 1: Build your own essential sensing and reasoning infrastructurePhD Dissertation: Visual Sensing and Middleware Support for Sentient Computing
Iteration 1: Build your own essential sensing and reasoning infrastructure
Funded by the Basque Government Education Department.
Laboratory for Communications Engineering (LCE), Cambridge University Engineering Department, England, UK, and AT&T Laboratories Cambridge.
Goals:
- build Sentient Spaces = computerised environments that sense & react
- close the gap between user and computer by using context
- make ubiquitous computing a reality through Sentient Computing
- ... by building your own low-cost, easily deployable infrastructure to make it feasible!!!
Developed during PhD research at the University of Cambridge (http://www.cl.cam.ac.uk/research/dtg/), supervised by Prof. Andy Hopper.
Sentient ComputingSentient Computing = computers + sensors + rules:
distributed sensors capture context, e.g. temperature, identity, location, etc
rules model how computers react to the stimuli provided by sensors
3 phases: (1) context capture, (2) context interpretation and (3) action triggering
PhD aim: to make the widespread adoption of Sentient Computing viable through:
location sensor deployable everywhere and for everyone
middleware support for easier sentient application development:
rule-based monitoring of contextual events and associated reactions
user-bound service lifecycle control to assist in action triggering
TRIP: a Vision-based Location Sensor
"Develop an easily-deployable location sensor technology with minimum hardware requirements and a low price"
TRIP (Target Recognition using Image Processing) identifies and locates tagged objects in the field of view of a camera.
Requires:
- off-the-shelf technology: cameras + PC + printer
- specially designed 2-D circular markers
- use of well-known image processing and computer vision algorithms
Cheap and easily deployable, so you can tag everything: e.g. people, computers, books, a stapler, etc.
Provides the accurate 3-D pose of objects within 3 cm and 2° error.
TRIPcode 2-D Marker
[Figure: a TRIPcode of radius 58 mm and ID 18,795, showing the sync sector, even-parity sectors and radius-encoding sectors]
A 2-D barcode carrying a ternary code.
Easy-to-identify bull's-eye: high contrast and invariant with respect to rotation and perspective.
2 code-encoding rings of 16 sectors each:
- 1 sector for synchronisation
- 2 for even-parity checking
- 4 for bull's-eye radius encoding
The remaining 9 sectors encode the ID: 3^9 = 19,683 valid codes.
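The ID sectors can be read off as a base-3 number. A minimal sketch of the arithmetic (hypothetical helper names, not the actual TRIP implementation); the example code "002200000" is taken from the recognition-process slide:

```python
def trip_decode(ternary: str) -> int:
    """Interpret a string of ternary digits (one per ID sector) as a base-3 number."""
    value = 0
    for digit in ternary:
        value = value * 3 + int(digit)
    return value

def trip_capacity(num_sectors: int) -> int:
    """Number of distinct IDs expressible with the given number of ternary ID sectors."""
    return 3 ** num_sectors

print(trip_capacity(9))          # 19683 valid codes with 9 ID sectors
print(trip_decode("002200000"))  # 1944, matching the worked example below
```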
Target Recognition Process
Stage 0: Grab Frame
Stage 1: Binarization
Stage 2: Binary Edge Detection
Stage 3: Edge Following & Filtering
Stages 4-7: Ellipse Fitting, Ellipse Concentricity Test, Code Deciphering and the POSE_FROM_TRIPTAG method
Example output:
- Ellipse params: x (335.432), y (416.361) pixel coords; a (8.9977), b (7.47734) pixel coords; θ (15.91) degrees
- Bull's-eye radius: 0120 (15 mm)
- TRIPcode: 002200000 (1,944)
- Translation vector (metres): (Tx=0.0329608, Ty=0.043217, Tz=3.06935)
- Target plane orientation angles (degrees): (α=-7.9175, β=-32.1995, γ=-8.45592)
- Distance to target: 3.06983 metres
Geometry of the POSE_FROM_TRIPTAG method
A Rule Paradigm for Sentient Computing
Sentient systems are reactive systems that perform actions in response to contextual events: they respond to the stimuli provided by distributed sensors by triggering actions to satisfy the user's expectations based on their current context, e.g. their identity, location or current activity.
Issue: the development of even simple sentient applications usually involves the correlation of inputs provided by diverse context sources.
Observation: the modus operandi of sentient applications is to wait until a pre-defined situation (a composite event pattern) is matched, then trigger an action.
ECA Rule Matching Engine
Sentient applications respond to an ECA (Event-Condition-Action) model:
- monitor contextual events coming from diverse sources
- correlate events to determine when a contextual situation occurs: e.g. IF two or more people in meeting room + sound level high THEN meeting on
- it is ineffective to force every app to handle the same behaviour separately
Solution: an ECA Rule Matching Service which:
- accepts rules specified by the user in the ECA language: <rule> ::= {<event-pattern-list> => <action-list>}
- automatically registers with the necessary event sources
- notifies clients with aggregated or composite events, or executes actions when rules fire:
  - aggregated event = a new event summarizing a situation
  - composite event = a batch of events corresponding to a situation
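The core loop — wait until every event in a composite pattern has been seen, then fire the action — can be sketched as a toy matcher (illustrative only; the real service uses the ECA language compiled to CLIPS, not this hypothetical API):

```python
class EcaRule:
    """Fires its action once every listed event type has been observed."""
    def __init__(self, event_types, action):
        self.pending = set(event_types)
        self.action = action

class EcaEngine:
    """A minimal stand-in for the ECA Rule Matching Service."""
    def __init__(self):
        self.rules = []

    def register(self, rule):
        self.rules.append(rule)

    def publish(self, event_type, **payload):
        for rule in list(self.rules):        # copy: rules may be removed while firing
            rule.pending.discard(event_type)
            if not rule.pending:             # composite pattern fully matched
                rule.action(payload)
                self.rules.remove(rule)

# "IF two or more people in meeting room + sound level high THEN meeting on"
fired = []
engine = EcaEngine()
engine.register(EcaRule({"presence", "sound_high"},
                        lambda e: fired.append("meeting_on")))
engine.publish("presence", room="MeetingRoom")
engine.publish("sound_high", room="MeetingRoom")
print(fired)  # ['meeting_on']
```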
ECA Service Architecture
Building a Sentient Jukebox with the ECA Service
"If it is Monday, a lab member is logged in and either he is working or it is raining outside, then play some cheerful music to raise the user's spirits"

within 15000 { /* enforce that events occur in a 15-second time span */
  query PCMonitor$logged_in(user ?userID, host ?hostID) and
  test(dayofweek = "Monday") and
  Location$presence(user ?userID) before
  /* a presence event must occur before any event on its RHS */
  ((PCMonitor$keyboard_activity(host ?hostID, intensity ?i) and test(?i > 0.3)) or
   (query WeatherMonitor$report(raining ?rainIntensity) and
    test(?rainIntensity > 0.2)))
  =>
  notifyEvent(Jukebox$play_music(?userID, ?hostID, "ROCK"));
}
Mapping of Rules to CLIPS

(assert (rule (ruleID 0) (ruleRegTime 1005472984621)))
(defrule rule0
  (PCMonitor$logged_in (user ?userID) (host ?hostID) (timestamp ?time0#))
  (test (eq (dayofweek) "Monday"))
  (Location$presence (user ?userID) (timestamp ?time1#))
  (test (> ?time1# 1005472984621))
  (test (> ?time1# (- (curtime) 15000)))
  (or (and (and (PCMonitor$keyboard_activity (host ?hostID)
                                             (intensity ?i) (timestamp ?time2#))
                (test (> ?time2# 1005472984621))
                (test (> ?time2# (- (curtime) 15000)))
                (test (> ?time2# ?time1#)))
           (test (> ?i 0.3)))
      (and (WeatherMonitor$report (raining ?rainIntensity) (timestamp ?time3#))
           (test (> ?rainIntensity 0.2))))
  =>
  (bind ?currentTime# (curtime))
  (bind ?factID0# (assert (Jukebox$play_music# 0 ?currentTime#
                                               ?userID ?hostID "ROCK")))
  (notify-event ?factID0#))
LocALE Framework
Need to provide support for the reactive behaviour of sentient systems: e.g. user-bound service activation after an aggregated event arrives.
LocALE = a CORBA-based solution for object lifecycle & location control:
- a hybrid of CORBA's Object LifeCycle Service and Implementation Repository
- addresses location-constrained service activation, deactivation and migration
- adds mobility, fault-tolerance and load-balancing to objects in a location domain
- generates permanent object references (independent of an object's network location)
- undertakes transparent client request redirection upon an object's location change
- useful for third-party object location controllers: e.g. "migrate the TRIP parser to another host when the used host's owner logs in"
Location-constrained Object Lifecycle Control
Why is CORBA location transparency not always desirable? Sometimes we want to control where objects are first located and later relocated, e.g. for load-balancing or follow-me applications.
LocALE provides apps with location-constrained object lifecycle control. Apps specify, on distributed object creation, its initial location:
- within a host, e.g. hostDN("guinness")
- any host in a spatial container (room), e.g. roomID("Room_1")
- any host in the location domain, e.g. hostDN("ANY")
- one of a given set of hosts, e.g. hostGroup("heineken", "guinness")
... and the restrictions under which an object can later be moved and/or recovered:
- LC_CONSTRAINT(RECOVERABLE | MOVABLE)  # any host of the location domain
- LC_CONSTRAINT(RECOVERABLE_WITHIN_ROOM | MOVABLE_WITHIN_ROOM)
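A hedged sketch of how such constraints might narrow the candidate hosts at creation time (the function and data layout are illustrative, not the LocALE API):

```python
def candidate_hosts(constraint, hosts_by_room):
    """Resolve a location constraint to the set of hosts an object may be created on."""
    kind, value = constraint
    all_hosts = {h for hosts in hosts_by_room.values() for h in hosts}
    if kind == "hostDN":                       # a named host, or ANY host in the domain
        return all_hosts if value == "ANY" else {value} & all_hosts
    if kind == "roomID":                       # any host in a spatial container
        return set(hosts_by_room.get(value, ()))
    if kind == "hostGroup":                    # one of a given set of hosts
        return set(value) & all_hosts
    raise ValueError(f"unknown constraint kind: {kind}")

rooms = {"Room_1": ["guinness", "heineken"], "Room_2": ["murphys"]}
print(candidate_hosts(("roomID", "Room_1"), rooms))
print(candidate_hosts(("hostGroup", ("heineken", "guinness")), rooms))
```

The lifecycle service would then pick one host from the resolved set (e.g. at random, for load balancing) and keep the constraint around to decide where the object may later be moved or recovered.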
LCE Active TRIPboard
Augments a whiteboard with interactive commands issued by placing special ringcodes in view of a camera observing the whiteboard. Activated by LocALE when a person enters the room, or through a web interface.
Registers rules with the ECA Rule Matching Server:
  Location$TRIPevent(TRIPcode 52491, cameraID "MeetingRoomCam") and
  Location$presence(user ?userID, room "LCE Meeting Room")
  => notifyEvent(CaptureSnapshotEvent("MeetingRoomCam", ?userID))
By means of LocALE, the application's TRIParser component is:
- created in a load-balanced way by randomly selecting one host in a hostGroup
- made fault-tolerant by recreating a failed recogniser in another host
Follow-Me Audio
Provides mobile users with music from the nearest set of speakers; the MP3 decoder and player follow the user to his new location.
- Uses TRIP as a real-time location and music-selection device
- Uses the ECA Service to register contextual situations to be monitored
- Uses LocALE's migration support
Iteration 2: Concentrate on explicit user-environment interaction: profit from what you already have in your hands!
EMI2lets: a Reflective Framework for Enabling AmI
Iteration 2: Concentrate on explicit user-environment interaction
The latest mobile devices are used mainly for communication, entertainment or as electronic assistants. However, their increasing:
- computational power
- storage
- communications (Wi-Fi, Bluetooth, GPRS)
- multimedia capabilities (camera, RFID reader)
- extensibility
make them ideal to act as intermediaries between us and the environment:
- aware (sentient) devices
- powerful devices
- always with us, anywhere, at any time
Our mobile devices can turn into our personal butlers!!!
Motivation
Build Smart Spaces and transform mobile devices into universal remote controllers of anything, anywhere, at any time.
Mobile devices equipped with Bluetooth, cameras, barcode readers, GPS or RFID are sentient devices (http://www.ctmd.deusto.es/mobilesense).
A Smart Space is a container, either indoors or outdoors, of Smart Objects. A Smart Object is an everyday object (e.g. a door) or device augmented with some computational service.
The definition of suitable AmI architectures may be a good starting point to make AmI a reality.
EMI2lets Platform I
EMI2lets is a middleware to facilitate the development and deployment of mobile context-aware applications for AmI spaces. It is a software platform to:
- convert physical environments into AmI spaces
- augment daily-life objects with computational services
- transform mobile devices into Smart Object remote controllers
EMI2lets Platform II
EMI2lets is an AmI-enabling middleware that addresses the service discovery and interaction aspects required for active influence on EMI2Objects.
It follows a Jini-like mechanism and the Smart Client paradigm: once a service is discovered, a proxy of it (an EMI2let) is downloaded into the user's device (EMI2Proxy).
An EMI2let is a mobile component transferred from a Smart Object to a nearby handheld device, which offers a graphical interface for the user to interact with that Smart Object.
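The discover-then-download-proxy pattern can be sketched as follows (a Python analogue with made-up class names; the actual platform transfers compiled EMI2let components, not Python objects):

```python
class SmartObject:
    """A Smart Object that can ship an EMI2let-like proxy to nearby devices."""
    def __init__(self, name):
        self.name = name

    def make_proxy(self):
        obj = self
        class Proxy:
            """Downloaded into the handheld; forwards commands to the Smart Object."""
            def invoke(self, command):
                return f"{obj.name} executed {command}"
        return Proxy()

class HandheldDevice:
    """An EMI2let Player analogue: discovers Smart Objects and hosts their proxies."""
    def __init__(self):
        self.proxies = {}

    def discover(self, smart_object):
        # upon discovery, the object's proxy is downloaded into the device
        self.proxies[smart_object.name] = smart_object.make_proxy()

    def interact(self, name, command):
        return self.proxies[name].invoke(command)

pda = HandheldDevice()
pda.discover(SmartObject("door"))
print(pda.interact("door", "open"))  # door executed open
```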
EMI2lets Deployment
[Diagram: EMI2lets are built with the EMI2let Designer in the EMI2let Framework and uploaded to EMI2let Servers on Smart Objects; each server transfers EMI2lets to the EMI2let Player runtime on handheld devices (PDA, mobile phone), and running EMI2lets communicate back to their EMI2let back-ends.]
How does it work?
[Diagram: Development → Upload to the server → Discover → Download → Interact (reproduction), with GPRS connectivity for remote access.]
EMI2lets Internal Architecture
- EMI2let Abstract Programming Model API
- Abstract-to-Concrete Mapping: Discovery, Interaction, Presentation and Persistence mappings
- Discovery: TRIP-based, UPnP, RFID-based and Bluetooth (SDP) service discovery
- Interaction: EMI2Protocol over Bluetooth RFCOMM; SOAP over Wi-Fi, GPRS/UMTS or Internet
EMI2 Internals
A 3-tier software architecture. The EMI2 framework defines 4 programming abstractions: Discovery, Communication, Presentation and Persistency.
An EMI2let plug-in = an abstraction implementation.
Common plug-ins: Bluetooth, Wi-Fi, UPnP. Special purpose: TRIP (Target Recognition using Image Processing).
Assembly fusion at runtime: reflection does the magic!!!
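The "assembly fusion at runtime" idea — choosing whichever plug-in implements an abstraction by name, via reflection — has a direct Python analogue using `importlib` (illustrative only; the platform itself fuses .NET assemblies):

```python
import importlib

def load_plugin(module_name, class_name):
    """Resolve and instantiate an abstraction implementation by name at runtime."""
    module = importlib.import_module(module_name)
    return getattr(module, class_name)()

# A plug-in chosen by configuration rather than compiled in; here we reflectively
# load a class from the standard library purely to demonstrate the mechanism.
decoder = load_plugin("json", "JSONDecoder")
print(type(decoder).__name__)  # JSONDecoder
```

Swapping a Bluetooth discovery plug-in for a UPnP one then becomes a configuration change rather than a code change.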
EMI2lets Applications
We have created EMI2lets for different application domains:
- Accessibility: blind (bus stop), deaf (conference)
- Home/office automation: comfort (lights), entertainment (WMP), surveillance (camera)
- Industry: robot
- Public spaces: restaurant, parking, airport
Conclusion
EMI2lets = middleware providing universal active influence to mobile devices over Smart Objects:
- Transforms mobile devices into universal remote controllers
- Enables both local and global access to those Smart Objects (anywhere/anytime)
- Independent of, and extensible in, the underlying service discovery and interaction, graphical representation and persistence mechanisms
- Enables AmI spaces using conventional, readily available hardware and software
- Follows a "write once, run on any device type" philosophy
Iteration 3: Easing AmI! Leverage from Web 2.0 principles and map them to AmIA Web 2.0 Platform to Enable Context-Aware Mobile Mash-ups
Iteration 3: Easing AmI! Leverage Web 2.0 principles
Issues impeding wide AmI deployment remain: AmI is possible if and only if environments are heavily instrumented with sensors and actuators. Besides, developing AmI apps continues to be very hard!
Still, mobile devices enable interaction anywhere at any time, both user-controlled (explicit) and system-controlled (implicit).
Is AmI possible without heavy and difficult instrumentation (or infrastructure-less)? YES, IT SHOULD BE, if we want to increase AmI adoption!!!
Research Aim
Lower the barrier of developing and deploying context-aware applications in uncontrolled global environments: not only my office or home, but also my city, other companies, shopping centres, and so on.
HOW? By converging mobile and ubiquitous computing with Web 2.0 into the Mobile Ubiquitous Web 2.0, adding context-aware social annotation to physical objects and locations in order to achieve AmI.
Sentient Graffiti: What does it do?
- Annotate every physical object or spatial region with info or services, both indoors and outdoors
- Filter annotations associated to surrounding resources based on user context and keyword filtering
- Enable user interaction with smart objects and spatial regions in both a PUSH and a PULL manner
Requirement: participation in a community of users interested in publishing and consuming context-aware empowered annotations and services.
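Filtering annotations by user context and keywords could look roughly like this (a sketch with made-up fields and a flat 2-D distance, not the actual SG data model):

```python
from math import hypot

def match(graffiti, user_pos, keywords):
    """Keep a graffiti if the user is within its range and shares a keyword."""
    dist = hypot(graffiti["x"] - user_pos[0], graffiti["y"] - user_pos[1])
    in_range = dist <= graffiti["range"]
    keyword_hit = not keywords or bool(set(keywords) & set(graffiti["keywords"]))
    return in_range and keyword_hit

graffitis = [
    {"id": "bus-stop", "x": 0,   "y": 0, "range": 50, "keywords": ["transport"]},
    {"id": "museum",   "x": 500, "y": 0, "range": 30, "keywords": ["culture"]},
]
nearby = [g["id"] for g in graffitis if match(g, (10, 10), ["transport"])]
print(nearby)  # ['bus-stop']
```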
Functionality
User's view:
- graffiti annotation: descriptions, keywords, contextual attributes
- graffiti discovery and consumption: TRIP, RFID, NFC, GPS
System's view:
- a context-aware folksonomy: tag/keyword-based
- a context-aware mash-up: GoogleMaps + our server back-end
Architecture
Deployment Scenarios
Multi-modal Interaction
Sentient Graffiti simplifies human-to-environment interaction through four mobile-mediated interaction modes:
Pointing – the user points his camera phone at a bi-dimensional visual marker and obtains all the graffitis associated with it
Touching – the user touches an RFID tag with a mobile RFID reader bound to a mobile through Bluetooth (or NFC mobile) and obtains the relevant graffitis
Location-aware – mobiles equipped with a GPS in outdoor environments obtain the relevant nearby graffitis in a certain location range
Proximity-aware – the device retrieves all the graffitis published in nearby accessible Bluetooth servers when it is in Bluetooth range
Sentient Graffiti & Near-Field Communication (NFC)
NFC is a combination of contactless identification and interconnection technologies enabling wireless short-range communication between devices and smart objects (range about 20 cm, 13.56 MHz band). It enables 3 types of services:
- service initiation and configuration
- P2P (peer-to-peer) data sharing and communication
- payment and ticketing
It is a key enabler for the upcoming Internet of Things.
How does Sentient Graffiti leverage NFC?
- Touching interaction through NFC: the MIDP 2.0 Push Registry and NFC are combined so users do not have to start the mobile client before interacting with RFID-augmented objects
- Proximity-aware interaction through NFC: a Nokia NFC 6131 and Bluetooth SG servers are bound by simply touching an RFID tag with the mobile
Sentient Graffiti Web Client
Application Types & Examples
Available prototypes:
- Marker-associated graffitis: Virtual Notice Board – public/private graffitis, expiration time, remote review, user participation
- Bluetooth-range graffitis: University Services Booth – individual, group and private graffitis, tag-based (OPEN_DAY)
- Location-range graffitis: Bus Alerter
- Third-party SG clients
Other possible applications:
- City tour: Bilbao_tourism graffiti domain
- Conference: AmI-07 feedback, expiration after the conference
- Publicity: graffiti expiration after N times
- Friend meetings
- Disco/stadium/office blogs
Marker-associated Graffitis: Virtual Notice Board
Bluetooth-range Graffitis: University Booth
Location-Range Graffitis: Bus Alerter
Third-Party Mobile Application using Sentient Graffiti HTTP API
Conclusions
Sentient Graffiti is a platform which promotes a more extensive adoption of AmI in global environments (cities, cars, hospitals, homes) without imposing deployment and maintenance hassles, offering the following features:
- Context-awareness to filter and select the most appropriate smart-object content and services for users
- Encourages the creation of third-party context-aware mash-ups through its HTTP API
- Based on standard web technologies, lowering its adoption barrier
- Enables multi-modal interaction between users and the environment through a generic mobile client
Further work:
- Evaluate SG in a mobile social-software community
- Adopt Semantic Web context modelling
Iteration 4: Dealing with the heterogeneity, dynamic behaviour of existing instrumented environments, using available standardsSmartLab: Semantically Dynamic Infrastructure for Intelligent Environments
Iteration 4: Dealing with the heterogeneity and dynamic behaviour of existing instrumented environments
Middleware support for intelligent environment provision:
- monitoring context
- determining the user activity and high-level context
- adapting the environment to the user
SmartLab Architecture
Layer 1: Sensors and Actuators
Layer 1: Sensors and Actuators
A device assortment common in intelligent environments:
- EIB/KNX
- Asterisk VoIP
- VideoIP cameras
- Indoor location system (Ubisense)
- People-wandering devices (Gerontek)
- Custom-built devices (WSN): chair, display bracelet, container
Every system has its own control interface. How do we interconnect all of them?
Layer 2: Service Abstraction
Layer 2: Service Abstraction
Every device or system provides certain functionalities that we must transform into software services inside OSGi. Each device must provide a control bundle acting as a proxy inside the OSGi platform, and all the native services of each device are wrapped in OSGi services:
- EIB/KNX bus → BinaryLight, DimmableLight, Alarm, DoorSensor
- VideoIP HTTP cameras → CameraController
- VoiceIP Asterisk server → AsteriskController
- Gerontek server → GerontekController
- Ubisense COM server → UbisenseController
- Custom-built devices → SmartChair, SmartContainer
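The wrapping pattern — one proxy bundle per device, registering native functions behind uniform service interfaces — can be sketched as follows (Python standing in for OSGi/Java; the registry and class names are illustrative, though the service name follows the slide):

```python
class ServiceRegistry:
    """A toy stand-in for the OSGi service registry."""
    def __init__(self):
        self.services = {}

    def register(self, interface, implementation):
        self.services.setdefault(interface, []).append(implementation)

    def get(self, interface):
        return self.services.get(interface, [])

class KnxBinaryLight:
    """Proxy wrapping a native EIB/KNX light behind a uniform BinaryLight interface."""
    def __init__(self, address):
        self.address, self.on = address, False

    def switch(self, on):
        self.on = on  # a real bundle would write the command to the KNX bus here
        return self.on

registry = ServiceRegistry()
registry.register("BinaryLight", KnxBinaryLight("1/0/1"))
light = registry.get("BinaryLight")[0]
print(light.switch(True))  # True
```

Clients then look up `BinaryLight` without caring whether the bulb sits on a KNX bus or behind some other protocol.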
Layer 2: Semantically-enhanced OSGi Bundles
Layer 3: Service and Context Management
Layer 3: Service Management
- Discovery service: a simple multicast protocol to obtain the bundles automatically
- Installer service: decides whether the bundle should be installed or not
- Local Repository service: extends the OBR service to provide a local cache for the discovered bundles
Example: Service Management
Layer 3: Context Management
Context information is modelled with an ontology:
- base core
- time and space relations
- events
New services might extend the knowledge base (classes and instances).
Behaviour rules convert inferred information into OSGi events to which the different services can register, reacting accordingly to specific events.
Layer 3: Ontology
Example: Context Management
Context Management
Two knowledge-generation methods in SmartLab:
- Ontological reasoning: makes use of RDFS (rdfs:domain, rdfs:subPropertyOf) and OWL (owl:TransitiveProperty) predicates; allows inferring implicit knowledge
- Rule-based reasoning: allows defining relationships among the entities in the ontology
Three types of inference:
- Semantic rules – enable ontological reasoning based on the RDF and OWL theoretical models
- Knowledge-extraction rules – extract new knowledge from the ontology's implicit knowledge
- Event-inferring rules – generate aggregated events from the context in the knowledge base
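As a flavour of what owl:TransitiveProperty reasoning buys, here is a minimal transitive-closure computation over a "contains" relation (illustrative only; SmartLab uses an OWL reasoner and rules, not hand-rolled code like this):

```python
def transitive_closure(pairs):
    """Derive all implied (a, c) facts from explicit (a, b) and (b, c) facts."""
    facts = set(pairs)
    changed = True
    while changed:
        changed = False
        for a, b in list(facts):
            for c, d in list(facts):
                if b == c and (a, d) not in facts:
                    facts.add((a, d))   # transitivity: a contains b, b contains d
                    changed = True
    return facts

# Explicit facts: the lab contains a meeting area, which contains a chair.
explicit = {("Lab", "MeetingArea"), ("MeetingArea", "Chair")}
inferred = transitive_closure(explicit)
print(("Lab", "Chair") in inferred)  # True: implicit knowledge made explicit
```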
Event-inferring Rule Example

[EVENT-Meeting:
  (?meetingArea rdf:type <http://deusto.es/smartlab.owl#MeetingArea>),
  (?meetingArea <http://deusto.es/smartlab.owl#containsPersonItem> ?p1),
  (?meetingArea <http://deusto.es/smartlab.owl#containsPersonItem> ?p2),
  (?p1 <http://deusto.es/smartlab.owl#name> ?name1),
  (?p2 <http://deusto.es/smartlab.owl#name> ?name2),
  notEqual(?name1, ?name2),
  makeTemp(?meetingEvent)
  ->
  (?meetingEvent rdf:type <http://deusto.es/smartlab.owl#LocableMeetingEvent>),
  (?meetingEvent <http://deusto.es/smartlab.owl#place> ?meetingArea),
  (?meetingEvent <http://deusto.es/smartlab.owl#name> ?name1),
  (?meetingEvent <http://deusto.es/smartlab.owl#name> ?name2)
]
Performance Results
Layer 4: Service programmability, Management and Interaction
Layer 4: Service Programmability, Management and Interaction
- Implicit interaction: context management generates events and some services are invoked automatically
- Explicit interaction: an HTTP interface inside OSGi to invoke any service that exposes remote methods, and a dashboard-like GUI based on widgets (a JavaScript cross-browser library) that are loaded when the services are active
SmartLab Dashboard
Conclusions
Several extensions to the OSGi framework to support intelligent and evolvable environment instrumentation have been presented:
- Devices or environment services expose a special semantic control bundle
- OSGi bundles are discovered and act as proxies providing semantically enhanced services
- These services populate the system with new context information in order to infer new knowledge and generate events
- Different services can register to receive context events and react to them accordingly
- The platform knowledge has been modelled using ontologies and rules that can be extended and updated dynamically
- For explicit interaction, an HTTP interface or a widget-based Dashboard GUI can be used to interact with the platform
Semantic reasoning is powerful but computationally costly!!
Iteration 5: Focus on a more specific application domain: AALZAINGUNE: Infrastructural Support for Ambient Assisted Living
Iteration 5: Focus on a more specific application domain: AAL
Some facts:
- By 2020, 25% of the EU's population will be over 65
- Spending on pensions, health and long-term care is expected to increase by 4-8% of GDP in the coming decades, with total expenditure tripling by 2050
- Older Europeans are important consumers, with a wealth of over €3,000 billion
Ambient Assisted Living (AAL) is a European Union initiative to address the needs of the ageing European population (http://www.aal-europe.eu/):
- Elderly people should be able to live longer in their preferred environments, enhancing their quality of life
- Costs for society and public health systems should be reduced
To make AAL a reality, it is important to devise new, easily deployable middleware and hardware infrastructure.
Motivation
The ZAINGUNE Project
Consortium composed by Fundación Deusto / Deustu Fundazioa (Tecnológico / Teknologikoa).
Aims to provide the software/hardware infrastructure (platform) required to easily deploy assistive services for elderly people @home.
http://www.tecnologico.deusto.es/projects/zaingune
HOW? With an OSGi gateway powered by a rule-based reasoning engine which allows the coordination and cooperation of the home sensing and actuation devices.
ZAINGUNE Goals
- Heterogeneous device support: Agotek's Gerontek, Asterisk IP phones, IP cameras, KNX-EIB devices, ...
- Model assistive environments as a set of cooperating services
- Programmability through a SOA-based approach
- Apply natural explicit interaction mechanisms: an easy-to-use, gadget-based and secure front-end, phone-mediated interaction, ...
ZAINGUNE Multi-layered Architecture
ZAINGUNE Multi-layered Architecture
The hardware device layer is composed of the sensors and actuators which populate an environment.
The software platform layer transforms the functionality provided by the devices managed by Layer 1 into software services (bundles) which can be combined to produce more advanced services. Every device within an AAL environment (home, residence, hospital) is encapsulated as a bundle or execution unit within our OSGi environment.
It includes two core bundles:
ZainguneController – core component of ZAINGUNE server, manages and controls access to the components (OSGi bundles) supported by ZAINGUNE.
ZainguneServlet – behaves as a Web Service/OSGi gateway, exporting the OSGi bundle functionality through Web Services and generating web front-ends (based on the JavaScript X library) for every bundle.
The applications environment layer includes all the possible application scenarios for the ZAINGUNE infrastructure: public housing flats for disabled or elderly people, hospitals, residences and so on.
Multi-modal Environment Interaction
- Web gadget-based interaction – an easy-to-use web gadget-based environment controller divided into the following sections:
  - Help – a single button to request help
  - Communications – call by photo, email and SMS
  - Home control – control of every device by container
  - Surveillance – both local and remote IP camera control
- Phone touchpad- and voice-based interaction – the integration of Asterisk in ZAINGUNE provides feedback through phone speakers and house control through keystrokes and voice commands
- Alert bracelet-based interaction – a special-purpose device designed for assistance seeking and alert notification
Multi-modal Environment Interaction
ZAINGUNE Alert Bracelet
A custom-built device combining an organic screen (µOLED-96-G1 from 4D Systems) with a WSN mote based on Mica2DOT, capable of displaying messages broadcast by nearby motes.
Every inhabitant may carry an alert bracelet for assistance seeking and alert notification.
A future work option is to add vital-sign monitoring sensors (e.g. the Nonin 4100 Avant module) to the device.
Intelligence through Rule-Based Reasoning
The adoption of a rule-based engine in ZAINGUNE offers two main advantages:
- it decouples environment configuration from programmability
- it enables environment-initiated proactive reactions
Environment intelligence is encapsulated as a set of rules which trigger when certain sensorial situations are matched: the LHS represents sensing conditions, whilst the RHS depicts the actions to be undertaken when the LHS situations are matched.
This rule-based paradigm is employed to configure the reactive behaviour of a ZAINGUNE-controlled environment: efficient management of energy resources, security at home, or danger-situation prevention.
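An energy-management rule of the kind described (LHS sensing conditions, RHS action) might be expressed like this (hypothetical condition and action names; the actual engine evaluates rules inside the OSGi gateway):

```python
def evaluate_rules(rules, state, actions_log):
    """Fire the RHS of every rule whose LHS conditions all hold in the current state."""
    for lhs, rhs in rules:
        if all(state.get(key) == value for key, value in lhs.items()):
            actions_log.append(rhs)

# "If nobody is present and the lights are on, switch them off" (energy saving)
rules = [({"presence": False, "lights_on": True}, "switch_lights_off")]
log = []
evaluate_rules(rules, {"presence": False, "lights_on": True}, log)
print(log)  # ['switch_lights_off']
```

Because the rules live in configuration rather than code, the environment's reactive behaviour can be changed without reprogramming the gateway.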

PDF
MobiSys Group Presentation
PDF
The DemaWare Service-Oriented AAL Platform for People with Dementia
PPT
Ukd2008 18-9-08 andrea
PPT
Caaa07 Presentation February Final
PPTX
A Preliminary Study on Architecting Cyber-Physical Systems
KEY
Semantic Web for AAL
PPTX
Exo cortex
PDF
PPT
Semantics in Sensor Networks
PPSX
Phd defence presentation
PDF
SOFIA project INDRA NEO Publication
Paams2011 pvalente-presentation-slides1
Sapere project-introduction-dec-2010
Context-Aware Computing
A survey on context aware system & intelligent Middleware’s
Selected Pervasive Computing edited 03.pdf
Rule-based Real-Time Activity Recognition in a Smart Home Environment
Epics introduction-dec-2010
UBIQUITOUS COMPUTING Its Paradigm, Systems & Middleware
Pervasive Computing
MobiSys Group Presentation
The DemaWare Service-Oriented AAL Platform for People with Dementia
Ukd2008 18-9-08 andrea
Caaa07 Presentation February Final
A Preliminary Study on Architecting Cyber-Physical Systems
Semantic Web for AAL
Exo cortex
Semantics in Sensor Networks
Phd defence presentation
SOFIA project INDRA NEO Publication
Ad

More from Diego López-de-Ipiña González-de-Artaza (20)

PDF
Validating a Citizen Observatories enabling Platform by completing a Citizen ...
PDF
Citizen Observatories to encourage more democratic data evidence-based decisi...
PDF
Democratizing co-production of thematic co-explorations for Citizen Observato...
PDF
Digital Twin aiding more effective Digital Maintenance
PDF
Humanized Computing: the path towards higher collaboration and reciprocal lea...
PDF
Generative AI How It's Changing Our World and What It Means for You_final.pdf
PDF
Democratizing Co-Production Of Sustainable Public Services
PDF
Ontological Infrastructure for Interoperable Research Information Systems: HE...
PDF
Fostering multi-stakeholder collaboration through co-production and rewarding
PDF
A Collaborative Environment to Boost Sustainable Engaged Research & Co-Produc...
PDF
A Collaborative Environment to Boost Co-Production of Sustainable Public Serv...
PDF
PrácticaParticipación-INTERLINK-realizingcoproduction_final.pdf
PDF
INTERLINK: Engaged Research through co-production
PDF
Internet of People: towards a Human-centric computing for Social Good
PDF
Boosting data-driven innovation in Europe with the support of DIHs
PDF
Social Coin: Blockchain-mediated incentivization of citizens for sustainable ...
PDF
Role of Data Incubators shaping European Data Spaces: EDI & REACH cases
PDF
Transiting to SMART COMMUNITIES by fostering Collaboration & CO-CREATION for ...
PDF
ROH: Proceso de Ingeniería Ontológica & Uso y Extensión de Vocabularios Estándar
PDF
Introduction to FAIR Data and Research Objects
Validating a Citizen Observatories enabling Platform by completing a Citizen ...
Citizen Observatories to encourage more democratic data evidence-based decisi...
Democratizing co-production of thematic co-explorations for Citizen Observato...
Digital Twin aiding more effective Digital Maintenance
Humanized Computing: the path towards higher collaboration and reciprocal lea...
Generative AI How It's Changing Our World and What It Means for You_final.pdf
Democratizing Co-Production Of Sustainable Public Services
Ontological Infrastructure for Interoperable Research Information Systems: HE...
Fostering multi-stakeholder collaboration through co-production and rewarding
A Collaborative Environment to Boost Sustainable Engaged Research & Co-Produc...
A Collaborative Environment to Boost Co-Production of Sustainable Public Serv...
PrácticaParticipación-INTERLINK-realizingcoproduction_final.pdf
INTERLINK: Engaged Research through co-production
Internet of People: towards a Human-centric computing for Social Good
Boosting data-driven innovation in Europe with the support of DIHs
Social Coin: Blockchain-mediated incentivization of citizens for sustainable ...
Role of Data Incubators shaping European Data Spaces: EDI & REACH cases
Transiting to SMART COMMUNITIES by fostering Collaboration & CO-CREATION for ...
ROH: Proceso de Ingeniería Ontológica & Uso y Extensión de Vocabularios Estándar
Introduction to FAIR Data and Research Objects

Recently uploaded (20)

PPTX
Machine Learning_overview_presentation.pptx
PPTX
MYSQL Presentation for SQL database connectivity
PPTX
20250228 LYD VKU AI Blended-Learning.pptx
PDF
Encapsulation_ Review paper, used for researhc scholars
PDF
TokAI - TikTok AI Agent : The First AI Application That Analyzes 10,000+ Vira...
PDF
Assigned Numbers - 2025 - Bluetooth® Document
PDF
Empathic Computing: Creating Shared Understanding
PDF
Reach Out and Touch Someone: Haptics and Empathic Computing
PPTX
A Presentation on Artificial Intelligence
PPTX
Programs and apps: productivity, graphics, security and other tools
PDF
Build a system with the filesystem maintained by OSTree @ COSCUP 2025
PDF
MIND Revenue Release Quarter 2 2025 Press Release
PDF
NewMind AI Weekly Chronicles - August'25-Week II
PDF
Review of recent advances in non-invasive hemoglobin estimation
PDF
cuic standard and advanced reporting.pdf
PDF
Approach and Philosophy of On baking technology
PPTX
Digital-Transformation-Roadmap-for-Companies.pptx
PDF
Machine learning based COVID-19 study performance prediction
PDF
Dropbox Q2 2025 Financial Results & Investor Presentation
PDF
Agricultural_Statistics_at_a_Glance_2022_0.pdf
Machine Learning_overview_presentation.pptx
MYSQL Presentation for SQL database connectivity
20250228 LYD VKU AI Blended-Learning.pptx
Encapsulation_ Review paper, used for researhc scholars
TokAI - TikTok AI Agent : The First AI Application That Analyzes 10,000+ Vira...
Assigned Numbers - 2025 - Bluetooth® Document
Empathic Computing: Creating Shared Understanding
Reach Out and Touch Someone: Haptics and Empathic Computing
A Presentation on Artificial Intelligence
Programs and apps: productivity, graphics, security and other tools
Build a system with the filesystem maintained by OSTree @ COSCUP 2025
MIND Revenue Release Quarter 2 2025 Press Release
NewMind AI Weekly Chronicles - August'25-Week II
Review of recent advances in non-invasive hemoglobin estimation
cuic standard and advanced reporting.pdf
Approach and Philosophy of On baking technology
Digital-Transformation-Roadmap-for-Companies.pptx
Machine learning based COVID-19 study performance prediction
Dropbox Q2 2025 Financial Results & Investor Presentation
Agricultural_Statistics_at_a_Glance_2022_0.pdf

Dealing with the need for Infrastructural Support in Ambient Intelligence

  • 1. Dealing with the need for Infrastructural Support in Ambient Intelligence. 2 June 2009 @ School of Computing and Mathematics, University of Ulster, Jordanstown campus. Dr. Diego Lz-de-Ipiña Glz-de-Artaza, Faculty of Engineering (ESIDE), University of Deusto, Bilbao. dipina@eside.deusto.es | http://www.morelab.deusto.es | http://www.smartlab.deusto.es | http://paginaspersonales.deusto.es/dipina
  • 2. Introduction. What is the endemic problem of AmI precluding its wider deployment? Probably many factors, but a very remarkable one is the... "unfortunate" high demand for infrastructural support: sensors, actuators, automation buses and protocols, wireless communication links, middleware, context modelling and reasoning engines, and so on and so forth.
  • 3. Research Motivation. Given that AmI is not possible without infrastructure, how do we alleviate this "unfortunate" need? Our approach/research aim: use and adapt low-cost off-the-shelf hardware infrastructure and combine it with intelligent middleware and interaction techniques to make "any" environment appear "intelligent". This talk describes several iterative research efforts addressing the infrastructure dependency issue.
  • 4. Talk Outline. Part 0: Bird's-eye view of my research group and laboratory activities (5'). Part 1: Review of my previous research work on solutions to address "the need for infrastructure of AmI" (35'). Iteration 1: Build your own sensing and reasoning infrastructure. Iteration 2: Concentrate on explicit user-environment interaction. Iteration 3: Leverage Web 2.0 principles and map them to AmI. Iteration 4: Deal with the heterogeneity and dynamic behaviour of existing instrumented environments. Iteration 5: Focus on a more specific application domain: AAL. Part 2: Review of current research lines & projects (10').
  • 5. MoreLab Research Group & SmartLab Research Laboratory @ University of Deusto
  • 6. University of Deusto, Bilbao. Private Jesuit university founded in 1886 in Bilbao, Basque Country, Spain. It offers degrees in: Business & Administration, Law, Psychology, Engineering, Social Sciences & Linguistics. URL: http://www.deusto.es
  • 7. Our Research Group: created in 2003, with 3 lecturers and 12 researchers. Specialized in Mobile-Mediated Interaction, Internet of Things, Smart Objects, Semantic Middleware and AAL. URL: http://www.morelab.deusto.es
  • 8. Remote Control of Embedded Sensing Devices
  • 12. RealWidget: desktop widgets for the real world
  • 13. RFID Glove (left) and RealWidget (right)
  • 15. Prototyping an intelligent chair (FlexChair) with an embedded wireless sensor node
  • 18. Our Research Lab: SmartLab. A research laboratory focused on AmI research. Aim: create an intelligent working space with a double purpose: (1) provide infrastructure to host and attract research projects related to AmI; (2) assess the suitability of AmI as a mechanism to enrich and improve the daily working activities of a group of users. URL: http://www.smartlab.deusto.es
  • 19. Our Research Lab: SmartLab
  • 20. Iteration 1: Build your own essential sensing and reasoning infrastructure. PhD Dissertation: Visual Sensing and Middleware Support for Sentient Computing
  • 21. Iteration 1: Build your own essential sensing and reasoning infrastructure. Developed during PhD research at the University of Cambridge (Laboratory for Communications Engineering (LCE), Cambridge University Engineering Department, England, UK; AT&T Laboratories Cambridge; funded by the Basque Government Education Department), supervised by Prof. Andy Hopper. http://www.cl.cam.ac.uk/research/dtg/ Goals: build Sentient Spaces = computerised environments that sense & react; close the gap between user and computer by using context; make ubiquitous computing a reality through Sentient Computing, by building your own low-cost, easily deployable infrastructure to make it feasible!!!
  • 22. Sentient Computing. Sentient Computing = computers + sensors + rules:
  • 23. distributed sensors capture context, e.g. temperature, identity, location, etc
  • 24. rules model how computers react to the stimuli provided by sensors
  • 25. 3 phases: (1) context capture, (2) context interpretation and (3) action triggering
  • 26. PhD aim: to make widespread adoption of Sentient Computing viable through:
  • 27. location sensor deployable everywhere and for everyone
  • 28. middleware support for easier sentient application development:
  • 29. rule-based monitoring of contextual events and associated reactions
  • 30. user-bound service lifecycle control to assist in action triggering. TRIP: a Vision-based Location Sensor. "Develop an easily-deployable location sensor technology with minimum hardware requirements and a low price." TRIP (Target Recognition using Image Processing) identifies and locates tagged objects in the field of view of a camera. Requires: off-the-shelf technology (cameras + PC + printer), specially designed 2-D circular markers, and well-known Image Processing and Computer Vision algorithms. Cheap and easily deployable, so you can tag everything: e.g. people, computers, books, a stapler, etc. Provides the accurate 3-D pose of objects within 3 cm and 2° error.
  • 31. TRIPcode 2-D Marker: a 2-D barcode carrying a ternary code. [Figure: annotated TRIPcode of radius 58 mm and ID 18,795, showing the sync sector, the even-parity sectors and the radius-encoding sectors.] Easy-to-identify bull's-eye: invariant with respect to rotation and perspective, high contrast. Two 16-bit code-encoding rings: 1 sector for synchronisation, 2 for even-parity checking, 4 for bull's-eye radius encoding. 3^9 = 19,683 valid codes.
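The marker's ternary code space can be illustrated with a small sketch (not the original TRIP implementation): 9 ternary code sectors give 3^9 = 19,683 possible IDs, and the digits read around the ring map to an integer ID, as in the Stage-7 example TRIPcode 002200000 = 1,944. The parity helper here is a toy assumption for illustration; the real marker dedicates two dedicated sectors to even-parity checking.

```python
def ternary_to_id(sectors):
    """Interpret a list of ternary digits (most significant first) as an ID."""
    code_id = 0
    for digit in sectors:
        if digit not in (0, 1, 2):
            raise ValueError("sectors must be ternary digits")
        code_id = code_id * 3 + digit
    return code_id

def even_parity_ok(sectors):
    """Toy parity check (assumption for illustration): digit sum must be even."""
    return sum(sectors) % 2 == 0

# 9 ternary sectors yield the code space quoted on the slide:
assert 3 ** 9 == 19683

# The recognition-example code 002200000 decodes to 1,944:
assert ternary_to_id([0, 0, 2, 2, 0, 0, 0, 0, 0]) == 1944
```

The same positional scheme, with more sectors reserved for synchronisation, parity and radius, accounts for the full two-ring layout of the printed marker.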
  • 32. Target Recognition Process. Stage 0: Grab Frame. Stage 1: Binarization. Stage 2: Binary Edge Detection. Stage 3: Edge Following & Filtering. Stages 4-7: Ellipse Fitting, Ellipse Concentricity Test, Code Deciphering and the POSE_FROM_TRIPTAG method. [Example output: ellipse params x=335.432, y=416.361, a=8.9977, b=7.47734 (pixel coords), 15.91 degrees; bull's-eye radius 0120 (15 mm); TRIPcode 002200000 (1,944); translation vector (metres): Tx=0.0329608, Ty=0.043217, Tz=3.06935; target plane orientation angles (degrees): -7.9175, -32.1995, -8.45592; distance to target: 3.06983 metres.]
  • 34. A Rule Paradigm for Sentient Computing. Sentient systems are reactive systems that perform actions in response to contextual events: they respond to the stimuli provided by distributed sensors by triggering actions to satisfy the user's expectations based on their current context, e.g. their identity, location or current activity. Issue: the development of even simple sentient applications usually involves correlating inputs provided by diverse context sources. Observation: the modus operandi of sentient applications is to wait until a pre-defined situation (a composite event pattern) is matched and then trigger an action.
  • 35. ECA Rule Matching Engine. Sentient applications respond to an ECA model: they monitor contextual events coming from diverse sources and correlate events to determine when a contextual situation occurs, e.g. IF two or more people are in the meeting room AND the sound level is high THEN a meeting is on. It is ineffective to force every app to handle the same behaviour separately. Solution: an ECA Rule Matching Service that accepts rules specified by the user in the ECA language (<rule> ::= {<event-pattern-list> => <action-list>}), automatically registers with the necessary event sources, and notifies clients with aggregated or composite events or executes actions when rules fire. Aggregated event = a new event summarizing a situation; composite event = a batch of events corresponding to a situation.
  • 37. Building a Sentient Jukebox with the ECA Service. "If it is Monday, a lab member is logged in and either he is working or it is raining outside, then play some cheerful music to raise the user's spirits":

within 15000 { /* Enforce events occur in a 15 s time span */
  query PCMonitor$logged_in(user ?userID, host ?hostID) and
  test(dayofweek = "Monday") and
  Location$presence(user ?userID) before /* a presence event must occur before any event on its RHS */
  ((PCMonitor$keyboard_activity(host ?hostID, intensity ?i) and test(?i > 0.3)) or
   (query WeatherMonitor$report(raining ?rainIntensity) and test(?rainIntensity > 0.2)))
  =>
  notifyEvent(Jukebox$play_music(?userID, ?hostID, "ROCK"));
}
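The correlation behind the jukebox rule can be sketched in a few lines of Python. This is a hypothetical, much-simplified stand-in for the real engine (no `before` ordering, no day-of-week test, no CLIPS backend): events carry timestamps, old events fall out of a 15-second window, and a rule fires when the composite pattern is satisfied.

```python
import time

WINDOW = 15.0  # seconds, as in the "within 15000" clause of the jukebox rule

class EcaEngine:
    """Toy ECA matcher: correlates recent events against registered patterns."""
    def __init__(self):
        self.events = []   # list of (name, attrs, timestamp)
        self.rules = []    # list of (pattern_fn, action_fn)

    def add_rule(self, pattern_fn, action_fn):
        self.rules.append((pattern_fn, action_fn))

    def notify(self, name, now=None, **attrs):
        now = time.time() if now is None else now
        self.events.append((name, attrs, now))
        # discard events outside the correlation window
        self.events = [e for e in self.events if now - e[2] <= WINDOW]
        fired = []
        for pattern_fn, action_fn in self.rules:
            if pattern_fn(self.events):
                fired.append(action_fn(self.events))
        return fired

# Pattern: a user is logged in AND (keyboard activity > 0.3 OR rain > 0.2)
def jukebox_pattern(events):
    names = {n for n, _, _ in events}
    busy = any(n == "keyboard_activity" and a.get("intensity", 0) > 0.3
               for n, a, _ in events)
    rainy = any(n == "weather_report" and a.get("raining", 0) > 0.2
                for n, a, _ in events)
    return "logged_in" in names and (busy or rainy)

engine = EcaEngine()
engine.add_rule(jukebox_pattern, lambda evs: "play_music(ROCK)")
engine.notify("logged_in", now=0.0, user="diego")
result = engine.notify("keyboard_activity", now=5.0, intensity=0.6)
# result == ["play_music(ROCK)"]: both events fall inside the 15 s window
```

The action here merely returns a string; in the real service, firing a rule either emits an aggregated event to subscribed clients or executes an action directly.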
  • 38. Mapping of Rules to CLIPS:

(assert (rule (ruleID 0) (ruleRegTime 1005472984621)))
(defrule rule0
  (PCMonitor$logged_in (user ?userID) (host ?hostID) (timestamp ?time0#))
  (test (eq (dayofweek) "Monday"))
  (Location$presence (user ?userID) (timestamp ?time1#))
  (test (> ?time1# 1005472984621))
  (test (> ?time1# (- (curtime) 15000)))
  (or (and (and (PCMonitor$keyboard_activity (host ?hostID) (intensity ?i) (timestamp ?time2#))
                (test (> ?time2# 1005472984621))
                (test (> ?time2# (- (curtime) 15000)))
                (test (> ?time2# ?time1#)))
           (test (> ?i 0.3)))
      (and (WeatherMonitor$report (raining ?rainIntensity) (timestamp ?time3#))
           (test (> ?rainIntensity 0.2))))
  =>
  (bind ?currentTime# (curtime))
  (bind ?factID0# (assert (Jukebox$play_music# 0 ?currentTime# ?userID ?hostID "ROCK")))
  (notify-event ?factID0#))
  • 39. LocALE Framework. Need to provide support for the reactive behaviour of sentient systems, e.g. user-bound service activation after aggregated event arrival. LocALE = a CORBA-based solution for object lifecycle & location control: a hybrid of CORBA's Object LifeCycle Service and Implementation Repository; addresses location-constrained service activation, deactivation and migration; adds mobility, fault-tolerance and load-balancing to objects in a location domain; generates permanent object references (independent of the object's network location); undertakes transparent client request redirection upon an object's location change; useful for third-party object location controllers, e.g. "migrate the TRIP parser to another host when the used host's owner logs in".
  • 40. Location-constrained Object Lifecycle Control. Why is CORBA location transparency not always desirable? Sometimes we want to control where objects are first located and later relocated, e.g. in load-balancing or follow-me applications. LocALE provides apps with location-constrained object lifecycle control. On distributed object creation, apps specify its initial location: within a host, e.g. hostDN("guinness"); any host in a spatial container (room), e.g. roomID("Room_1"); any host in the location domain, e.g. hostDN("ANY"); or one of a given set of hosts, e.g. hostGroup("heineken", "guinness"). They also specify the restrictions under which an object can later be moved and/or recovered: LC_CONSTRAINT(RECOVERABLE | MOVABLE) # any host of the location domain; LC_CONSTRAINT(RECOVERABLE_WITHIN_ROOM | MOVABLE_WITHIN_ROOM)
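The creation constraints and movement restrictions above can be sketched as follows. This is a hedged illustration with hypothetical names (HOSTS, ManagedObject, create), not the CORBA API: creation picks a host satisfying the spatial constraint, and migration is refused when it would violate the object's lifecycle constraint.

```python
# Hypothetical host-to-room map for a small location domain.
HOSTS = {"guinness": "Room_1", "heineken": "Room_1", "murphys": "Room_2"}

MOVABLE = "MOVABLE"
MOVABLE_WITHIN_ROOM = "MOVABLE_WITHIN_ROOM"

class ManagedObject:
    def __init__(self, host, constraint):
        self.host, self.constraint = host, constraint

    def migrate(self, target):
        """Relocate the object, honouring its LC_CONSTRAINT-style restriction."""
        if (self.constraint == MOVABLE_WITHIN_ROOM
                and HOSTS[target] != HOSTS[self.host]):
            raise RuntimeError("migration would leave the room")
        # In LocALE the permanent reference hides this move from clients.
        self.host = target

def create(room=None, host_group=None, constraint=MOVABLE):
    """Create the object on some host satisfying the location constraint."""
    candidates = [h for h, r in HOSTS.items()
                  if (room is None or r == room)
                  and (host_group is None or h in host_group)]
    if not candidates:
        raise RuntimeError("no host satisfies the constraint")
    return ManagedObject(sorted(candidates)[0], constraint)

obj = create(room="Room_1", constraint=MOVABLE_WITHIN_ROOM)
obj.migrate("heineken" if obj.host == "guinness" else "guinness")  # same room: allowed
# obj.migrate("murphys") would raise: Room_2 violates MOVABLE_WITHIN_ROOM
```

A third-party location controller (e.g. the "migrate the TRIP parser" policy) would simply call `migrate` in response to context events; the constraint check is what keeps such policies within the intended spatial bounds.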
  • 41. LCE Active TRIPboard. Augments a whiteboard with interactive commands issued by placing special ringcodes in view of a camera observing the whiteboard. Activated by LocALE when a person enters the room, or through a web interface. Registers rules with the ECA Rule Matching Server: Location$TRIPevent(TRIPcode 52491, cameraID "MeetingRoomCam") and Location$presence(user ?userID, room "LCE Meeting Room") => notifyEvent(CaptureSnapshotEvent("MeetingRoomCam", ?userID)). By means of LocALE, the application's TRIParser component is created in a load-balanced way by randomly selecting one host in a hostGroup, with fault-tolerance by recreating a failed recogniser on another host.
  • 42. Follow-Me Audio. Provides mobile users with music from the nearest set of speakers: the MP3 decoder and player follow the user to his new location. Uses TRIP as a real-time location and music selection device; uses the ECA Service to register the contextual situations to be monitored; uses LocALE's migration support.
  • 43. Iteration 2: Concentrate on explicit user-environment interaction: profit from what you already have in your hands! EMI2lets: a Reflective Framework for Enabling AmI
  • 44. Iteration 2: Concentrate on explicit user-environment interaction. The latest mobile devices are used mainly for communication, entertainment or as electronic assistants. However, their increasing computational power, storage, communications (Wi-Fi, Bluetooth, GPRS), multimedia capabilities (camera, RFID reader) and extensibility make them ideal intermediaries between us and the environment: aware (sentient) devices, powerful devices, always with us anywhere at any time. Our mobile devices can turn into our personal butlers!!!
  • 45. Motivation. Build Smart Spaces and transform mobile devices into Universal Remote Controllers of Anything, Anywhere, at Anytime. Mobile devices equipped with Bluetooth, cameras, barcode readers, GPS or RFID are sentient devices: http://www.ctmd.deusto.es/mobilesense. A Smart Space is a container, either indoors or outdoors, of Smart Objects. A Smart Object is an everyday object (e.g. a door) or device augmented with some computational service. Definition of suitable AmI architectures may be a good starting point to make AmI a reality.
  • 46. EMI2lets Platform I. EMI2lets is a middleware to facilitate the development and deployment of mobile context-aware applications for AmI spaces: a software platform to convert physical environments into AmI spaces, augment daily-life objects with computational services, and transform mobile devices into Smart Object remote controllers.
  • 47. EMI2lets Platform II. EMI2lets is an AmI-enabling middleware that addresses the service discovery and interaction aspects required for active influence on EMI2Objects. It follows a Jini-like mechanism and the Smart Client paradigm: once a service is discovered, a proxy of it (an EMI2let) is downloaded into the user's device (EMI2Proxy). An EMI2let is a mobile component transferred from a Smart Object to a nearby handheld device, which offers a graphical interface for the user to interact with that Smart Object.
  • 48. EMI2lets Deployment. [Architecture diagram: EMI2lets are built with the EMI2let Designer (EMI2let Framework) and uploaded to EMI2let Servers attached to Smart Objects; on discovery, an EMI2let is transferred to a handheld device (PDA, mobile phone), where it runs inside the EMI2let Player/runtime and communicates back to its EMI2let back-end.]
  • 49. How does it work? [Diagram: development and upload to the server; then discover, download and interact (reproduction), with EMI2let-to-back-end communication over, e.g., GPRS.]
  • 50. EMI2lets Internal Architecture. EMI2let Abstract Programming Model API; Abstract-to-Concrete Mapping (Discovery, Interaction, Presentation and Persistence Mappings); EMI2Protocol over Bluetooth RFCOMM; SOAP over Wi-Fi, GPRS/UMTS or the Internet; TRIP-based, UPnP, RFID-based and Bluetooth (SDP) service discovery.
  • 51. EMI2 Internals. 3-tier software architecture. The EMI2 framework defines 4 programming abstractions: Discovery, Communication, Presentation and Persistency. An EMI2let plug-in = an abstraction implementation. Common plug-ins: Bluetooth, Wi-Fi, UPnP. Special purpose: TRIP (Target Recognition using Image Processing). Assembly fusion at runtime: reflection does the magic!!!
  • 52. EMI2lets Applications. We have created EMI2lets for different application domains. Accessibility: blind (bus stop), deaf (conference). Home/office automation: comfort (lights), entertainment (WMP), surveillance (camera). Industry: robot. Public spaces: restaurant, parking, airport.
  • 53. Conclusion. EMI2lets = middleware providing universal active influence to mobile devices over Smart Objects: transforms mobile devices into universal remote controllers; enables both local and global access to those Smart Objects (anywhere/anytime); is independent of, and extensible in, the underlying service discovery and interaction, graphical representation and persistence mechanisms; enables AmI spaces using conventional, readily-available hardware and software; follows a "write once, run on any device type" philosophy.
  • 54. Iteration 3: Easing AmI! Leverage Web 2.0 principles and map them to AmI. A Web 2.0 Platform to Enable Context-Aware Mobile Mash-ups
  • 55. Iteration 3: Easing AmI! Leverage Web 2.0 principles. Issues impeding wide AmI deployment remain: AmI is possible if and only if environments are heavily instrumented with sensors and actuators; besides, developing AmI apps continues to be very hard! Still, mobile devices enable interaction anywhere at any time, both user-controlled (explicit) and system-controlled (implicit). Is AmI possible without heavy and difficult instrumentation (i.e. infrastructure-less)? YES, IT SHOULD BE, if we want to increase AmI adoption!!!
  • 56. Research Aim. Lower the barrier of developing and deploying context-aware applications in uncontrolled global environments: not only my office or home, but also my city, other companies, shopping centres, and so on. HOW? By converging mobile and ubiquitous computing with Web 2.0 into the Mobile Ubiquitous Web 2.0, adding context-aware social annotation to physical objects and locations in order to achieve AmI.
  • 57. Sentient Graffiti. What does it do? Annotate every physical object or spatial region with info or services, both indoors and outdoors; filter annotations associated with surrounding resources based on user context and keyword filtering; enable user interaction with smart objects and spatial regions in both a PUSH and a PULL manner. Requirement: participation in a community of users interested in publishing and consuming context-aware empowered annotations and services.
  • 58. Functionality. User's view: graffiti annotation (descriptions, keywords, contextual attributes); graffiti discovery and consumption (TRIP, RFID, NFC, GPS). System's view: a context-aware folksonomy (tag/keyword-based); a context-aware mash-up (GoogleMaps + our server back-end).
  • 61. Multi-modal Interaction. Sentient Graffiti simplifies human-to-environment interaction through four mobile-mediated interaction modes. Pointing: the user points his camera phone at a bi-dimensional visual marker and obtains all the graffitis associated with it
  • 62. Touching – the user touches an RFID tag with a mobile RFID reader bound to a mobile through Bluetooth (or NFC mobile) and obtains the relevant graffitis
  • 63. Location-aware – mobiles equipped with a GPS in outdoor environments obtain the relevant nearby graffitis in a certain location range
  • 64. Proximity-aware: the device retrieves all the graffitis published in nearby accessible Bluetooth servers when it is in Bluetooth range
  • 65. Sentient Graffiti & Near Field Communication. NFC is a combination of contact-less identification and interconnection technologies enabling wireless short-range communication between devices and smart objects (range about 20 cm, 13.56 MHz band). It enables 3 types of services: service initiation and configuration; P2P (peer-to-peer) data sharing and communication; payment and ticketing. A key enabler for the upcoming Internet of Things. How does Sentient Graffiti leverage NFC? Touching interaction through NFC: the MIDP 2.0 Push Registry and NFC are combined so that users do not have to start the mobile client before interacting with RFID-augmented objects. Proximity-aware interaction through NFC: the Nokia 6131 NFC and Bluetooth SG servers are bound by simply touching an RFID tag with the mobile.
  • 68. Application Types & Examples. Available prototypes: marker-associated graffitis: Virtual Notice Board (public/private graffitis, expiration time, remote review, user participation); Bluetooth-range graffitis: University Services Booth (individual, group and private graffitis, tag-based (OPEN_DAY)); location-range graffitis: Bus Alerter; third-party SG clients. Other possible applications: city tour (Bilbao_tourism graffiti domain); conference (AmI-07 feedback, expiration after the conference); publicity (graffiti expiration after N times); friend meetings; disco/stadium/office blogs.
  • 72. Third-Party Mobile Application using Sentient Graffiti HTTP API
  • 73. Conclusions. Sentient Graffiti is a platform which promotes a more extensive adoption of AmI in global environments (cities, cars, hospitals, homes) without imposing deployment and maintenance hassles, offering the following features: context-awareness to filter and select the most appropriate smart-object content and services for users; encouragement of third-party context-aware mash-ups through its HTTP API; a basis in standard web technologies, lowering its adoption barrier; multi-modal interaction between users and the environment through a generic mobile client. Further work: evaluate SG in a mobile social software community; adopt Semantic Web context modelling.
  • 74. Iteration 4: Dealing with the heterogeneity and dynamic behaviour of existing instrumented environments, using available standards. SmartLab: Semantically Dynamic Infrastructure for Intelligent Environments
  • 75. Iteration 4: Dealing with the heterogeneity and dynamic behaviour of existing instrumented environments. Middleware support for intelligent environment provision: monitoring context; determining the user activity and high-level context; adapting the environment to the user.
  • 77. Layer 1: Sensors and Actuators
  • 78. Layer 1: Sensors and Actuators. A device assortment common in intelligent environments: EIB/KNX; Asterisk VoIP; VideoIP cameras; indoor location system (Ubisense); people-wandering devices (Gerontek); custom-built devices (WSN): chair, display bracelet, container. Every system has its own control interface. How do we interconnect all of them?
  • 79. Layer 2: Service Abstraction
  • 80. Layer 2: Service Abstraction. Every device or system provides certain functionalities that we must transform into software services inside OSGi. Each device must provide a control bundle acting as a proxy inside the OSGi platform; all the native services of each device are wrapped in OSGi services. EIB/KNX bus → BinaryLight, DimmableLight, Alarm, DoorSensor; VideoIP HTTP cameras → CameraController; VoiceIP Asterisk server → AsteriskController; Gerontek server → GerontekController; Ubisense COM server → UbisenseController; custom-built devices → SmartChair, SmartContainer.
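The wrapping step above is the classic proxy pattern. The following language-agnostic sketch illustrates it (the real platform uses OSGi bundles in Java, and the `NativeKnxLight` API here is an assumption for illustration): a device-specific native interface is hidden behind a uniform service registered under a well-known name.

```python
class ServiceRegistry:
    """Toy stand-in for the OSGi service registry."""
    def __init__(self):
        self._services = {}

    def register(self, name, service):
        self._services[name] = service

    def get(self, name):
        return self._services[name]

class NativeKnxLight:
    """Stands in for a native EIB/KNX control interface (hypothetical API)."""
    def __init__(self):
        self.level = 0

    def write_group_address(self, value):
        self.level = value

class BinaryLight:
    """Uniform service wrapping the native API, as a control bundle would."""
    def __init__(self, native):
        self._native = native

    def turn_on(self):
        self._native.write_group_address(1)

    def turn_off(self):
        self._native.write_group_address(0)

registry = ServiceRegistry()
registry.register("BinaryLight", BinaryLight(NativeKnxLight()))

light = registry.get("BinaryLight")
light.turn_on()  # callers never touch the KNX-specific group-address API
```

The payoff is the one the slide claims: upper layers (context management, rules) only ever see `BinaryLight`, `CameraController`, etc., so adding a new bus means adding one proxy bundle, not changing every consumer.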
  • 82. Layer 3: Service and Context Management
  • 83. Layer 3: Service Management. Discovery service: a simple multicast protocol to obtain the bundles automatically. Installer service: decides whether a bundle should be installed or not. Local Repository service: extends the OBR service to provide a local cache for the discovered bundles.
  • 85. Layer 3: Context Management. Context information is modelled with an ontology: a base core, time and space relations, and events. New services may extend the knowledge base with classes and instances. Behaviour rules convert inferred information into OSGi events to which the different services can register, reacting accordingly to specific events.
  • 88. Context Management. Two knowledge-generation methods in SmartLab: (1) ontological reasoning, which makes use of RDF (rdf:domain), RDFS (rdfs:subPropertyOf) and OWL (owl:TransitiveProperty) predicates and allows implicit knowledge to be inferred; (2) rule-based reasoning, which allows defining relationships among the entities in the ontology. Three types of inference: semantic rules enable ontological reasoning based on the RDF and OWL theoretical models; knowledge-extraction rules extract new knowledge from the ontology's implicit knowledge; event-inferring rules generate aggregated events from the context in the knowledge base.
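The ontological-reasoning step can be illustrated with a toy forward-chaining sketch: applying an owl:TransitiveProperty-style rule over (subject, property, object) triples until no new facts appear. This is illustrative only (the triple names are invented); the platform itself reasons over an OWL ontology with a full semantic-web stack.

```python
def infer_transitive(triples, prop):
    """Close a set of (s, p, o) triples under transitivity of `prop`."""
    facts = set(triples)
    changed = True
    while changed:
        changed = False
        for (s1, p1, o1) in list(facts):
            for (s2, p2, o2) in list(facts):
                # transitivity: (a prop b) and (b prop c) => (a prop c)
                if p1 == p2 == prop and o1 == s2:
                    new = (s1, prop, o2)
                    if new not in facts:
                        facts.add(new)
                        changed = True
    return facts

base = {("Desk1", "locatedIn", "MeetingArea"),
        ("MeetingArea", "locatedIn", "SmartLab")}
closed = infer_transitive(base, "locatedIn")
# Implicit knowledge made explicit: ("Desk1", "locatedIn", "SmartLab")
```

Knowledge-extraction and event-inferring rules work the same way at heart: match a pattern over the fact base, assert new facts (or, in the event-inferring case, emit an aggregated OSGi event instead).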
  • 89. Event-inferring Rule Example:

[EVENT-Meeting:
  (?meetingArea rdf:type <http://deusto.es/smartlab.owl#MeetingArea>),
  (?meetingArea <http://deusto.es/smartlab.owl#containsPersonItem> ?p1),
  (?meetingArea <http://deusto.es/smartlab.owl#containsPersonItem> ?p2),
  (?p1 <http://deusto.es/smartlab.owl#name> ?name1),
  (?p2 <http://deusto.es/smartlab.owl#name> ?name2),
  notEqual(?name1, ?name2),
  makeTemp(?meetingEvent)
  ->
  (?meetingEvent rdf:type <http://deusto.es/smartlab.owl#LocableMeetingEvent>),
  (?meetingEvent <http://deusto.es/smartlab.owl#place> ?meetingArea),
  (?meetingEvent <http://deusto.es/smartlab.owl#name> ?name1),
  (?meetingEvent <http://deusto.es/smartlab.owl#name> ?name2)
]
  • 91. Layer 4: Service Programmability, Management and Interaction
  • 92. Layer 4: Service Programmability, Management and Interaction
    - Implicit interaction – context management generates events and some services are invoked automatically.
    - Explicit interaction – an HTTP interface inside OSGi to invoke any service that exposes remote methods, and a dashboard-like GUI based on widgets (a cross-browser JavaScript library) that are loaded when the services are active.
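The explicit-interaction HTTP interface can be sketched with the JDK's built-in `com.sun.net.httpserver` package: a request path is mapped to a registered service operation and the result is returned as the response body. The `/invoke/` path scheme and the example `lamp/status` service are assumptions for illustration; the real platform exposes the remote methods of OSGi bundles instead.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.function.Supplier;

// Minimal embedded HTTP gateway: GET /invoke/<name> runs the service
// registered under <name> and returns its result as plain text.
public class ServiceHttpGateway {
    private final HttpServer server;

    public ServiceHttpGateway(Map<String, Supplier<String>> services) throws Exception {
        server = HttpServer.create(new InetSocketAddress(0), 0); // ephemeral port
        server.createContext("/invoke/", exchange -> {
            String name = exchange.getRequestURI().getPath().substring("/invoke/".length());
            Supplier<String> svc = services.get(name);
            String body = (svc != null) ? svc.get() : "unknown service";
            byte[] bytes = body.getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(svc != null ? 200 : 404, bytes.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(bytes);
            }
        });
        server.start();
    }

    public int port() { return server.getAddress().getPort(); }

    public void stop() { server.stop(0); }
}
```

A dashboard widget would then simply issue `GET http://<gateway>/invoke/lamp/status` (or similar) and render the returned value.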
  • 94. Conclusions
    Several extensions to the OSGi framework to support intelligent and evolvable environment instrumentation have been presented:
    - Devices and environment services expose a special semantic control bundle.
    - OSGi bundles are discovered and act as proxies providing semantically enhanced services.
    - These services populate the system with new context information in order to infer new knowledge and generate events.
    - Different services can register to receive context events and react to them accordingly.
    - The platform knowledge has been modelled using ontologies and rules that can be extended and updated dynamically.
    - For explicit interaction, an HTTP interface or a widget-based Dashboard GUI can be used to interact with the platform.
    - Semantic reasoning is powerful but computationally costly!!
  • 95. Iteration 5: Focus on a more specific application domain: AALZAINGUNE: Infrastructural Support for Ambient Assisted Living
  • 96. Iteration 5: Focus on a more specific application domain: AAL
    Some facts:
    - By 2020, 25% of the EU's population will be over 65.
    - Spending on pensions, health and long-term care is expected to increase by 4-8% of GDP in the coming decades, with total expenditure tripling by 2050.
    - Older Europeans are important consumers, with a combined wealth of over €3,000 billion.
    Ambient Assisted Living (AAL) is a European Union initiative to address the needs of the ageing European population:
    - Elderly people should be able to live longer in their preferred environments, enhancing their quality of life.
    - Costs for society and public health systems should be reduced.
    - http://www.aal-europe.eu/
    To make AAL a reality, it is important to devise new, easily-deployable middleware and hardware infrastructure.
  • 98. The ZAINGUNE Project
    - Aims to provide the software/hardware infrastructure (platform) required to easily deploy assistive services for elderly people at home.
    - URL: http://www.tecnologico.deusto.es/projects/zaingune
    - How? With an OSGi gateway powered by a rule-based reasoning engine which allows the coordination and cooperation of the home's sensing and actuation devices.
    - Consortium members include Fundación Deusto Tecnológico (Teknologikoa Deustu Fundazioa).
  • 99. ZAINGUNE Goals
    - Heterogeneous device support: Agotek's gerontek, Asterisk IP phones, IP cameras, KNX/EIB devices, ...
    - Model assistive environments as a set of cooperating services.
    - Programmability through a SOA-based approach.
    - Apply natural explicit interaction mechanisms: an easy-to-use, gadget-based and secure front-end; phone-mediated interaction; ...
  • 101. ZAINGUNE Multi-layered Architecture
    - The hardware device layer is composed of the sensors and actuators which populate an environment.
    - The software platform layer transforms the functionality provided by the devices managed by Layer 1 into software services (bundles) which can be combined into more advanced services.
    - Every device within an AAL environment (home, residence, hospital) is encapsulated as a bundle, i.e. an execution unit, within our OSGi environment.
  • 102. It includes two core bundles:
  • 103. ZainguneController – the core component of the ZAINGUNE server; manages and controls access to the components (OSGi bundles) supported by ZAINGUNE.
  • 104. ZainguneServlet – behaves as a Web Service/OSGi gateway, exporting OSGi bundle functionality through Web Services and generating web front-ends (based on the JavaScript X library) for every bundle.
    The applications environment layer includes all the possible application scenarios for the ZAINGUNE infrastructure: public housing flats for disabled or elderly people, hospitals, residences, and so on.
  • 105. Multi-modal Environment Interaction
    - Web gadget-based interaction – an easy-to-use web gadget-based environment controller divided into the following sections:
      - Help – a single button to request help
      - Communications – call-by-photo, email and SMS
      - Home control – control of every device, grouped by container
      - Surveillance – both local and remote IP camera control
    - Phone touchpad- and voice-based interaction – the integration of Asterisk in ZAINGUNE provides feedback through phone speakers and house control through keystrokes and voice commands.
    - Alert bracelet-based interaction – a special-purpose device designed for assistance seeking and alert notification.
  • 108. ZAINGUNE Alert Bracelet
    - A custom-built device combining an organic screen (4D Systems µOLED-96-G1) with a WSN mote based on the Mica2DOT, capable of displaying messages broadcast by nearby motes.
    - Every inhabitant may carry an alert bracelet for assistance seeking and alert notification.
    - A future work option is to add vital-sign monitoring sensors (e.g. the Nonin 4100 Avant module) to the device.
  • 109. Intelligence through Rule-Based Reasoning
    - The adoption of a rule-based engine in ZAINGUNE offers two main advantages: it decouples environment configuration from programmability, and it enables environment-initiated proactive reactions.
    - Environment intelligence is encapsulated as a set of rules which trigger when certain sensorial situations are matched: the left-hand side (LHS) represents sensing conditions whilst the right-hand side (RHS) depicts the actions to be undertaken when those conditions are met.
    - This rule-based paradigm is employed to configure the reactive behaviour of a ZAINGUNE-controlled environment: efficient management of energy resources, security at home, or danger-situation prevention.
  • 110. Rule-reasoning Example
    rule "Flooding Event"
      no-loop true
      activation-group "eib"
      salience 10
    when
      event : ZainguneEvent()
      houseInfo : ZainguneHouseInfo()
      eventSender : EventSender()
      eval(event.getTopic().equals("es/deusto/tecnologico/osgi/eventadmin/eib/VALUE_CHANGE"))
      eval(houseInfo.checkDeviceType((String) event.getProperty("name"), "FloodingSensor"))
      eval("On".equals((String) event.getProperty("newValue")))
    then
      String topic = "es/deusto/tecnologico/zaingune/emergency/send";
      Hashtable<String, String> properties = new Hashtable<String, String>();
      ZainguneDeviceInfo deviceInfo = houseInfo.getDeviceInfoByName((String) event.getProperty("name"));
      ZainguneRoomInfo roomInfo = houseInfo.getDevicesRoom(deviceInfo.getName());
      properties.put("location", roomInfo.getName());
      properties.put("emergency_type", "flooding");
      properties.put("message", "Flooding in room " + roomInfo.getName());
      Event outputEvent = createEvent(topic, properties);
      eventSender.sendEvent(outputEvent);
      retract(event);
    end
  • 111. Conclusions
    - ZAINGUNE provides several easily-deployable ICT infrastructure contributions for progressive adoption in elderly people's homes.
    - Our main outcome is an OSGi platform powered by a rule-based reasoning engine which integrates a KNX/EIB automation bus, VoIP and Video-over-IP infrastructure to configure more aware and reactive homes.
    - An assortment of multi-modal explicit interaction mechanisms to request services from the environment has been shown: a touch-screen web gadget-based dashboard, an alert bracelet, and VoIP phone-mediated interaction.
  • 112. Review of Currently Active Research Activities
    Mobile Prosumer, Personal Mobile Sensing, Embedded Service Infrastructure, AAL Devices
  • 113. Prosumer Concept: mIO!
    - mIO!: personal; 'm': mobile; 'IO': input-output (consumer-producer); '!': immediate.
    - mIO! aims to develop technologies which help provide ubiquitous services within an intelligent environment, adjusted to each user and his context.
    - The mobile will be used as an interface both with services offered by companies and with micro-services created and provided on the move, directly by users themselves.
    - URL: http://www.cenitmio.es
  • 114. mIO!: Summary
    - Ambient Intelligence Interaction – context management and service discovery taking into account user preferences; personalization: service adaptation and filtering.
    - New Mobility User Experience – new interfaces, exploring access technologies to use devices on the mobile phone or near to it; mixing real and virtual information.
    - User Generated Mobile Services – services generated by users "on the go" (the prosumer paradigm): light services and mashups created easily on the mobile; Mobile Open APIs: the mobile as the main door to services which help the user.
    - Real Time Enterprise on the Move – mobile context services for enterprises and governmental organisations; cities and companies as intelligent environments.
    - Communication and Connectivity Technologies – prospection of the underlying communication and connectivity technologies.
  • 115. Prosumer Concept: MUGGES
    MUGGES: Mobile User Generated Geo Services
    - A new approach for exploiting innovative GNSS-based mobile LBS applications (personal, social or professional): the mobile prosumer concept.
    - A new location model combining GNSS-based positioning and user-provided social positioning in order to support more meaningful location-based services.
    - A new business model, with "the mobile as the service platform", "the user adds value" and "the very long tail" as its three main pillars.
    - A new GNSS-based application paradigm driving a new service infrastructure and platform tools.
    - URL: http://www.mugges-fp7.org
  • 116. Personal Smart Sensing Mobile Devices: PIRAmIDE
    - PIRAmIDE aims to transform our mobile devices into a sixth sense which aids and mediates on our behalf, easing and improving our daily interactions with everyday objects and locations.
    - An important aspect is to address the needs of the visually impaired.
    - URL: http://www.piramidepse.com
  • 117. CBDP: Context-Based Digital Personality
    - Creation of a context-based digital personality (DP) which acts as an enabling proxy between digital surroundings and the final user.
    - DPs will benefit from mobile technologies for context creation, maintenance and usage, and from semantic technologies for formal decisions and verifications.
    - Usage of DPs will simplify everyday interaction between users and their surrounding digital environments.
    - URL: http://projects.celtic-initiative.org/cbdp/
  • 118. Service Infrastructure for Embedded Wireless Devices: ISMED
    - Aims to provide the software infrastructure required to develop and deploy cooperative intelligent environments equipped with heterogeneous wireless embedded devices.
    - Adopts Triple Space Computing for the communication, coordination and cooperation needs of the project.
  • 119. Service Infrastructure for Embedded Wireless Devices: ISMED
  • 120. AAL Devices
    The goal is to contribute several devices which can be easily integrated into an elderly person's home:
    - A set-top box with an event scheduler and monitor, providing an interface through the TV and remote control.
    - A digital photo frame with enhanced communication, alert and interaction capabilities.
    - ...
  • 121. References
    - A Web 2.0 Platform to Enable Context-Aware Mobile Mash-ups, Diego López-de-Ipiña, Juan Ignacio Vazquez and Joseba Abaitua, Proceedings of AmI-07: European Conference on Ambient Intelligence, Darmstadt, Germany, 7-10 November 2007, B. Schiele et al. (Eds.), LNCS 4794, pp. 266-286, ISSN 0302-9743, ISBN-10 3-540-76651-0.
    - EMI2lets: a Reflective Framework for Enabling AmI, Diego López-de-Ipiña, Juan Ignacio Vázquez, Daniel García, Javier Fernández, Iván García, David Sainz and Aitor Almeida, Journal of Universal Computer Science (J.UCS), vol. 12, no. 3, pp. 297-314, March 2006.
    - TRIP: a Low-Cost Vision-Based Location System for Ubiquitous Computing, Diego López-de-Ipiña, Paulo Mendonça and Andy Hopper, Personal and Ubiquitous Computing, Springer, vol. 6, no. 3, pp. 206-219, May 2002.
    - Visual Sensing and Middleware Support for Sentient Computing, Diego López-de-Ipiña, PhD thesis, Cambridge University Engineering Department, January 2002.
    - Infrastructural Support for Ambient Assisted Living, Diego López-de-Ipiña, Xabier Laiseca, Ander Barbier, Unai Aguilera, Aitor Almeida, Pablo Orduña and Juan Ignacio Vazquez, Proceedings of the 3rd Symposium of Ubiquitous Computing and Ambient Intelligence 2008, Advances in Soft Computing, vol. 51, Springer, ISSN 1615-3871, ISBN 978-3-540-85866-9, University of Salamanca, Spain, 22-24 October 2008.
    - An Approach to Dynamic Knowledge Extension and Semantic Reasoning in Highly-Mutable Environments, Aitor Almeida, Diego López-de-Ipiña, Unai Aguilera, Iker Larizgoitia, Xabier Laiseca, Pablo Orduña and Ander Barbier, Proceedings of the 3rd Symposium of Ubiquitous Computing and Ambient Intelligence 2008, Advances in Soft Computing, vol. 51, Springer, ISSN 1615-3871, ISBN 978-3-540-85866-9, University of Salamanca, Spain, 22-24 October 2008.
  • 122. Dealing with the need for Infrastructural Support in Ambient Intelligence?Dr. Diego Lz-de-Ipiña Glz-de-ArtazaFaculty of Engineering (ESIDE), University of Deusto, Bilbaodipina@eside.deusto.eshttp://www.morelab.deusto.eshttp://www.smartlab.deusto.eshttp://paginaspersonales.deusto.es/dipina