
OntologySummit2015 Track A Session - Ontology Integration in the Internet of Things - Thu 2015-02-05

Session Co-chairs: RamSriram & LeoObrst

Billions of things will be connected to the Internet. These things span a spectrum of cognitive abilities, from simple sensors to humans. Ontologies will play a significant role in integrating these things at different abstraction levels. The goal of Track A (Ontology Integration in the Internet of Things) is to discuss the various approaches being taken to address integration and interoperability issues. We intend to present case studies of the IoT, discuss current approaches to integration and interoperability, discuss gaps in those approaches, and discuss issues of vertical integration and interoperability across layers of the IoT, including granularity. We also want to propose methods for achieving integration and interoperability through ontologies, and to propose a unified framework for integration and interoperability for multimodal (audio, text, video, etc.) interfaces.

Agenda and Presenters

  • Agenda
    • Overview of Track A: Ontology Integration in the IoT - Dr. LeoObrst (MITRE), Dr. RamSriram (NIST)
    • An Ontology-Driven Integration Framework for Smart Communities - Dr. SteveRay (Carnegie Mellon University)
      • Abstract: This presentation describes our work concerning the definition of a neutral, abstract ontology and framework that supports the vision and diverse contexts of a smart community. This framework is composed of a general, core ontology that supports what many are calling the Internet of Things, a scalable number of extension ontologies to describe various application perspectives, and a mapping methodology to relate external data and/or schemas to our ontology. Finally, we show why this ontology is robust, scalable and generic enough to support a wide range of smart devices, systems and people. (A minimal illustrative sketch of this core-plus-extension pattern is shown after the agenda below.)
    • Dynamic Semantics for the Internet of Things - Dr. PayamBarnaghi (University of Surrey, UK)
      • Abstract: The rapid increase in the number of network-enabled devices and sensors deployed in physical environments is changing information communication networks and services and applications in various domains. It is predicted that within the next decade billions of devices will generate large volumes of real world data for many applications and services in a variety of areas such as smart grids, smart homes, healthcare, automotive, transport, logistics and environmental monitoring. The technologies and solutions that enable integration of real world data and services into current information networking technologies are often described under the umbrella term of the Internet of Things (IoT).
      • When dealing with large volumes of distributed and heterogeneous IoT data, issues related to interoperability, automation, and data analytics will require common description and data representation frameworks and machine-readable, machine-interpretable data descriptions. IoT data is heterogeneous and multi-modal, can be of variable quality, and is often streamed. Interoperability is a key requirement to support large-scale IoT deployments and multi-provider systems. However, the dynamicity, diversity and resource constraints of the IoT environment can hinder using semantic technologies in the way they are used on the web. This talk will provide an overview of the use cases and requirements for semantic interoperability in the IoT, with a focus on annotation, processing, information extraction and dynamicity in the IoT environment. Some of the recent and ongoing research and development in this domain will also be discussed.
    • Semantic Integration Prototype for Wearable Devices in Health Care - Dr. JackHodges (Web of Things (WOT) Research Group, Siemens Berkeley Laboratory)
      • Abstract: Semantic technologies and ontologies will be useful in application contexts when they can be integrated with information sources and existing application model formats. Often this requires an approach of using some semantic information in lightweight processing models and accessing richer semantic models when needed, but based on rich and common underlying ontologies. In many application contexts there exist curated domain models which can be leveraged if they can be integrated. In this talk a prototype of one such scenario will be presented – the use of curated biomedical ontologies to assist health care professionals in selecting appropriate wearable devices to monitor diagnosed disorders. This prototype sought to provide access to and search across five kinds of information in mostly curated ontologies: wearable devices, quantities, diseases, symptoms, and anatomical parts. The choices made and issues confronted will be discussed. (An illustrative toy version of such a lookup is shown after the agenda below.)
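
To make the core-plus-extension pattern from the first abstract above more concrete, here is a minimal sketch in Python using rdflib. It is not the actual CMU ontology or framework; every namespace, class, property and the sample external record below are hypothetical placeholders, chosen only to illustrate a generic core, one application extension, and a mapping of external data onto them.

    # Illustrative sketch only: hypothetical namespaces and terms, not the
    # ontology presented in the talk.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF, RDFS, XSD

    CORE = Namespace("http://example.org/smart-community/core#")      # hypothetical core ontology
    LIGHT = Namespace("http://example.org/smart-community/lighting#") # hypothetical extension
    DATA = Namespace("http://example.org/city-data/")                 # hypothetical instance data

    g = Graph()
    g.bind("core", CORE)
    g.bind("light", LIGHT)

    # Core ontology: generic Internet-of-Things notions shared by all extensions.
    g.add((CORE.Device, RDF.type, RDFS.Class))
    g.add((CORE.Observation, RDF.type, RDFS.Class))
    g.add((CORE.observedBy, RDF.type, RDF.Property))

    # Extension ontology: one application perspective (street lighting) that
    # specializes the core rather than redefining it.
    g.add((LIGHT.StreetLamp, RDFS.subClassOf, CORE.Device))
    g.add((LIGHT.luminousFlux, RDF.type, RDF.Property))

    # Mapping step: relate a record from an external schema (e.g. a row from a
    # legacy asset database) to the ontology terms.
    external_record = {"asset_id": "lamp-042", "lumens": 12000}
    lamp = DATA[external_record["asset_id"]]
    g.add((lamp, RDF.type, LIGHT.StreetLamp))
    g.add((lamp, LIGHT.luminousFlux, Literal(external_record["lumens"], datatype=XSD.integer)))

    print(g.serialize(format="turtle"))

The point of the sketch is only the layering: applications written against the core vocabulary keep working as new extension ontologies and newly mapped data sources are added.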
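In the same spirit, the wearable-devices prototype described in the last abstract can be pictured as a cross-ontology lookup: given a diagnosed disorder, find devices that measure a clinically relevant quantity. The toy graph and SPARQL query below are made up for illustration (hypothetical terms in an example.org namespace); the actual prototype draws on curated biomedical ontologies of devices, quantities, diseases, symptoms and anatomical parts.

    # Illustrative toy only: hypothetical terms, not the curated ontologies used
    # in the prototype.
    from rdflib import Graph, Namespace
    from rdflib.namespace import RDF

    EX = Namespace("http://example.org/wearables#")  # hypothetical namespace

    g = Graph()
    g.bind("ex", EX)

    # Devices and the physical quantities they measure.
    g.add((EX.WristMonitorA, RDF.type, EX.WearableDevice))
    g.add((EX.WristMonitorA, EX.measuresQuantity, EX.HeartRate))
    g.add((EX.ChestPatchB, RDF.type, EX.WearableDevice))
    g.add((EX.ChestPatchB, EX.measuresQuantity, EX.BloodOxygenSaturation))

    # A disorder and a quantity relevant to monitoring it.
    g.add((EX.AtrialFibrillation, EX.monitoredVia, EX.HeartRate))

    # Which wearable devices can monitor the diagnosed disorder?
    query = """
        PREFIX ex: <http://example.org/wearables#>
        SELECT ?device WHERE {
            ex:AtrialFibrillation ex:monitoredVia ?quantity .
            ?device a ex:WearableDevice ;
                    ex:measuresQuantity ?quantity .
        }
    """
    for row in g.query(query):
        print(row.device)  # expected: http://example.org/wearables#WristMonitorA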

Presentation Material

Conference Call Details

  • Date: Thursday, 05-Feb-2015
  • Start Time: 9:30am PST / 12:30pm EST / 6:30pm CET / 5:30pm GMT / 1730 UTC
  • Expected Call Duration: ~1.5 hours
  • Dial-in:
    • Phone (US): +1 (425) 440-5100 ... (long distance cost may apply)
      • ... [ backup nbr: (315) 401-3279 ]
      • when prompted enter Conference ID: 843758#
    • Skype: join.conference (i.e. make a skype call to the contact with skypeID="join.conference") ... (generally free-of-charge, when connecting from your computer ... ref.)
      • when prompted enter Conference ID: 843758#
      • Unfamiliar with how to do this on Skype? ...
        • Add the contact "join.conference" to your skype contact list first. To participate in the teleconference, make a skype call to "join.conference", then open the dial pad (see platform-specific instructions below) and enter the Conference ID: 843758# when prompted.
      • Can't find Skype Dial pad? ...
        • for Windows Skype users: it's under the "Call" dropdown menu as "Show Dial pad"
        • for Linux Skype users: please note that the dial pad is only available on v4.1 or later (or on the earlier Skype 2.x versions). If the dial pad button is not shown in the call window, press the "d" hotkey to enable it. ... (ref.)
  • In-session chat-room url: http://webconf.soaphub.org/conf/room/summit_20150205
    • instructions: once you have access to the page, click on the "settings" button and identify yourself (by changing the Name field from "anonymous" to your real name, e.g. "JaneDoe").
    • You can indicate that you want to ask a question verbally by clicking on the "hand" button, and wait for the moderator to call on you; or, type and send your question into the chat window at the bottom of the screen.
    • thanks to the soaphub.org folks, one can now use a jabber/xmpp client (e.g. gtalk) to join this chatroom: just add the room as a buddy, in our case summit_20150205@soaphub.org ... Handy for mobile devices!
  • Discussions and Q & A:
    • Nominally, when a presentation is in progress, the moderator will mute everyone, except for the speaker.
    • To un-mute, press "*7"; to mute, press "*6". (Please mute your phone, especially if you are in noisy surroundings, or if you are introducing noise, echoes, etc. into the conference line.)
    • We will usually save all questions and discussion until after all presentations are through. In the meantime, you are encouraged to jot down questions in the chat area (that way they get documented, and you might even get some answers in the interim, through the chat).
    • During the Q&A / discussion segment (when everyone is muted), if you want to speak or have questions or remarks to make, please raise your hand (virtually) by clicking on the "hand button" (lower right) on the chat session page. You may speak when acknowledged by the session moderator (again, press "*7" on your phone to un-mute). Please test your voice and introduce yourself before proceeding with your remarks. (Remember to click on the "hand button" again to lower your hand, and press "*6" on your phone to mute yourself, after you are done speaking.)
  • An RSVP to ram.sriram@nist.gov with your affiliation is appreciated, or simply add yourself to the "Expected Attendee" list below (if you are already a member of the community).
  • Please note that this session may be recorded; if so, the audio archive is expected to be made available as open content, along with the proceedings of the call, to our community membership and the public at large under our prevailing open IPR policy.

Attendees