
Revision as of 11:11, 29 January 2015 by KennethBaclawski (Talk | contribs)



OntologySummit2015 Track B Session - Thu 2015-01-29     (1)

Semantic technologies and ontologies, such as the Semantic Sensor Network (SSN) ontology and its associated reasoning, play a major role in the IoT because they are increasingly being applied to help intelligently process and understand sensor information. This supports the overall IoT vision of ubiquitous and increasingly small sensors and actuators, embedded in physical devices, providing information about the environment, with systematic interpretation of the collected data via linked wired and wireless networks connected to the Internet. This information augments our awareness of sensed environments and thus serves as an IoT bridge between the physical and digital realms. The SSN ontology, soon to be an OGC standard, has been a useful starting point for work relevant to the IoT. A good deal of work and many publications have grown up around the ontology, and some lessons were learned along the way. A sensor network focus allows discussion of some of these topics along with the major challenges in utilizing semantic technologies for the IoT.     (1C)

The track will discuss the major challenges in utilizing semantic technologies for the IoT, and current efforts on developing the next generation of semantic technology for integrating, processing and understanding sensor information and for developing smart sensor networks.     (1D)

Working with various researchers, we will discuss the use of the SSNO and its various extensions, look at parts of the ontology problem space, and explore possible solutions.     (1E)

Gary Berg-Cross (SOCoP)     (1G)

Overview of the "Beyond SSNO" track topic     (1H)

Presenters     (1I)

  • Jeff Voas (NIST) – Networks of Things: Pieces, Parts and Data (Sensors Model for IoT)     (1I1)
  • Cory Henson: Semantic Sensor Network Ontology: past, present, and future     (1I2)

Abstract - In the summer of 2011, the Semantic Sensor Network Ontology (SSNO) was born from the sudden and rapid emergence of sensor data on the Web. The SSNO provides a common language to enable more effective interoperation, integration, and discovery of Web-based sensing systems and the data they produce. Over the past three years the SSNO has been used in a wide range of applications, and with this experience the SSNO is now set to become an OGC/W3C standard. The emergence of the IoT, however, presents new challenges and requirements that, perhaps, were not fully appreciated in the summer of 2011. How will this new development affect the semantics of sensing going forward? Stay tuned.     (1I3)
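As a rough illustration (not taken from the session materials), the core SSN observation pattern the abstract alludes to, in which a Sensor observes a Property and an Observation is linked back to that Sensor via observedBy, can be sketched as plain Python triples. The URIs and the temperature value below are made up for illustration only:

```python
# Simplified sketch of the SSN observation pattern using plain Python triples.
# The "ex:" identifiers and the reading are hypothetical example data.

SSN = "http://purl.oclc.org/NET/ssnx/ssn#"  # original SSN namespace

triples = [
    ("ex:thermometer1", SSN + "observes", "ex:airTemperature"),
    ("ex:obs42", "rdf:type", SSN + "Observation"),
    ("ex:obs42", SSN + "observedBy", "ex:thermometer1"),
    ("ex:obs42", SSN + "observedProperty", "ex:airTemperature"),
    ("ex:obs42", SSN + "observationResult", '"21.5"^^xsd:float'),
]

def objects(subject, predicate):
    """Return all objects matching a given subject/predicate pair."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Which sensor produced observation ex:obs42?
print(objects("ex:obs42", SSN + "observedBy"))  # prints ['ex:thermometer1']
```

In a real system these triples would live in an RDF store and be queried with SPARQL; the point here is only the shape of the Sensor/Observation/Property linkage that the SSNO standardizes.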

  • Dial-in:     (1J4)
    • Phone (US): +1 (425) 440-5100 ... (long distance cost may apply)     (1J4A)
    • Skype: join.conference (i.e. make a skype call to the contact with skypeID="join.conference") ... (generally free-of-charge, when connecting from your computer ... ref.)     (1J4B)
      • when prompted enter Conference ID: 843758#     (1J4B1)
      • Unfamiliar with how to do this on Skype? ...     (1J4B2)
        • Add the contact "join.conference" to your skype contact list first. To participate in the teleconference, make a skype call to "join.conference", then open the dial pad (see platform-specific instructions below) and enter the Conference ID: 843758# when prompted.     (1J4B2A)
      • Can't find Skype Dial pad? ...     (1J4B3)
        • for Windows Skype users: the dial-pad is under the "Call" dropdown menu as "Show Dial pad"     (1J4B3A)
        • for Linux Skype users: please note that the dial-pad is only available on v4.1 (or later), or on the earlier Skype versions 2.x; if the dial-pad button is not shown in the call window, press the "d" hotkey to enable it. ... (ref.)     (1J4B3B)
    • instructions: once you have access to the page, click on the "settings" button and identify yourself (by changing the Name field from "anonymous" to your real name, like "JaneDoe").     (1J5A)
    • You can indicate that you want to ask a question verbally by clicking on the "hand" button, and wait for the moderator to call on you; or, type and send your question into the chat window at the bottom of the screen.     (1J5B)
    • thanks to the folks, one can now use a jabber/xmpp client (e.g. gtalk) to join this chatroom. Just add the room as a buddy - (in our case here) ... Handy for mobile devices!     (1J5C)
  • Discussions and Q & A:     (1J6)
    • Nominally, when a presentation is in progress, the moderator will mute everyone, except for the speaker.     (1J6A)
    • To un-mute, press "*7" ... To mute, press "*6" (please mute your phone, especially if you are in noisy surroundings, or if you are introducing noise, echoes, etc. into the conference line.)     (1J6B)
    • we will usually save all questions and discussions until after all presentations are through. You are encouraged to jot down questions in the chat-area in the meantime (that way, they get documented; and you might even get some answers in the interim, through the chat.)     (1J6C)
    • During the Q&A / discussion segment (when everyone is muted), if you want to speak or have questions or remarks to make, please raise your hand (virtually) by clicking on the "hand button" (lower right) on the chat session page. You may speak when acknowledged by the session moderator (again, press "*7" on your phone to un-mute). Please test your voice and introduce yourself before proceeding with your remarks. (Remember to click on the "hand button" again to lower your hand, and press "*6" on your phone to mute yourself after you are done speaking.)     (1J6D)
  • RSVP to with your affiliation appreciated, ... or simply just by adding yourself to the "Expected Attendee" list below (if you are a member of the community already.)     (1J8)
  • Please note that this session may be recorded, and if so, the audio archive is expected to be made available as open content, along with the proceedings of the call to our community membership and the public at-large under our prevailing open IPR policy.     (1J10)

Attendees     (1K)