Ontolog Forum
Duration | 1.5 hours |
---|---|
Date/Time | 31 Aug 2022 16:00 GMT |
 | 9:00am PDT / 12:00pm EDT |
 | 5:00pm BST / 6:00pm CEST |
Convener | Ken Baclawski |
Neurosymbolic Computation with the Logic Virtual Machine
- Arun K. Majumdar and John F. Sowa (Slides, Video Recording)
The Logic Virtual Machine (LVM) supports ISO Standard Prolog with tensors as a native datatype. The tensors can represent graphs or networks of any kind. For most applications, the great majority of elements of the tensors are zero, and the nonzero elements are bit strings of arbitrary length. Those bit strings may encode symbolic information, floating-point numbers, or integers of any size.
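To make the sparse-tensor idea concrete, here is a minimal Python sketch, not the LVM implementation or API: the tensor is stored as a dictionary keyed by index tuples, with Python's arbitrary-precision integers standing in for the bit strings of arbitrary length. All names here (`SparseTensor` and its methods) are hypothetical illustrations.

```python
# Hypothetical sketch of a sparse tensor whose nonzero elements are
# arbitrary-length bit strings. Not the LVM API; for illustration only.

class SparseTensor:
    """Maps index tuples to nonzero bit strings (arbitrary-size ints).
    Absent indices are implicitly zero, so storage grows with the
    number of nonzero elements, not with the tensor's full shape."""

    def __init__(self, shape):
        self.shape = shape
        self.data = {}          # {(i, j, ...): bit string as int}

    def __setitem__(self, idx, bits):
        if bits:                # store only nonzero elements
            self.data[idx] = bits
        else:
            self.data.pop(idx, None)

    def __getitem__(self, idx):
        return self.data.get(idx, 0)

# An adjacency tensor for a graph: edge (0, 2) carries a 128-bit label,
# while the other 999,999 elements cost no storage at all.
g = SparseTensor((1000, 1000))
g[0, 2] = (1 << 127) | 0b1011
print(len(g.data))   # 1
```

Because only nonzero elements are stored, a graph with a million potential edges but a handful of actual edges occupies a handful of dictionary entries.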
If the nodes encode symbolic information, the tensors may represent conceptual graphs, knowledge graphs, or the logics of the Semantic Web, Formal UML, and other knowledge representation languages. As an extension of Prolog, LVM can perform logical reasoning or other kinds of transformations on or with those notations.
Since the bit strings in the tensors may be interpreted as arbitrary numbers, operations on those tensors may perform the same subsymbolic computations used in neural networks. As an extension of Prolog, LVM can relate neural tensors and symbolic tensors for neurosymbolic reasoning.
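The subsymbolic side can be sketched the same way. The following Python fragment, again an illustration independent of LVM, reads the stored elements as numbers and performs a sparse matrix-vector product of the kind that drives a neural-network layer; `sparse_matvec` is a hypothetical name.

```python
# Illustrative only: numeric use of the same dictionary-of-nonzeros
# representation. weights maps (row, col) -> float; absent entries
# are implicitly zero.

def sparse_matvec(weights, x, n_rows):
    """Compute y = W @ x, where W is given as {(i, j): w_ij}."""
    y = [0.0] * n_rows
    for (i, j), w in weights.items():   # visit only nonzero weights
        y[i] += w * x[j]
    return y

W = {(0, 0): 0.5, (1, 2): -1.0}   # 2 nonzeros in a 2x3 weight matrix
x = [2.0, 3.0, 4.0]
print(sparse_matvec(W, x, 2))     # [1.0, -4.0]
```

The same storage scheme thus serves both roles: read the elements as symbols and the tensor is a knowledge graph; read them as numbers and it is a weight matrix.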
Since the symbolic and subsymbolic tensors are supported by the same LVM system, operations that relate them can be performed with a single Prolog statement. For special-purpose operations and connections to other systems, LVM can invoke subroutines in Python or C.
By combining the strengths of symbolic AI with neural networks, LVM spans the full range of learning and reasoning methods developed for either or both. Three major applications are computational chemistry, sentiment analysis in natural language documents, and fraud analysis in financial transactions.
All three applications take advantage of the unique LVM features: precise symbolic reasoning, neural-network style of learning, and the ability to analyze huge volumes of graphs by precise methods or probabilistic methods. Applications in computational chemistry have obtained excellent results in molecular toxicology for the EPA TOX21 challenge. Applications in sentiment analysis combine Prolog's advantages for natural language processing (NLP) with a neural-network style of learning. Applications in financial fraud analysis combine NLP methods and learning methods that must be accurate to fractions of a cent.
Agenda
- 12:00 - 12:30 John Sowa
- 12:30 - 13:00 Arun Majumdar
- 13:00 - 13:30 Discussion
Conference Call Information
- Date: Wednesday, 31 Aug 2022
- Start Time: 9:00am PDT / 12:00pm EDT / 6:00pm CEST / 5:00pm BST / 1600 UTC
- ref: World Clock
- Expected Call Duration: 1.5 hours
- The Video Conference URL is https://bit.ly/3c7arB1
- Meeting ID: 889 3493 8136
- Passcode: 030714
- Chat room: https://bit.ly/3A9cjRE
Attendees
- Adrian Walker
- Alex Shkotin
- Ali Hashemi
- Arun Majumdar
- Bill McCarthy
- Bobbin Teegarden
- Brandon Whitehead
- Chris Ahern
- Daniel Schwabe
- David Eddy
- Doug Holmes
- Gary Berg-Cross
- Geoff Campbell
- Jack Park
- James Davenport
- Janet Singer
- Jim Logan
- John Sowa
- Ken Baclawski
- Michael Singer
- Mike Bennett
- Nancy Wiegand
- Pat Cassidy
- Paul Tyson
- Ram D. Sriram
- Ravi Sharma
- Robert Rovetto
Discussion
[12:23] Mike Bennett: Question (Slide 11): Given the higher expressivity of CG compared to OWL, would that mean that mapping these original sentences to the ontology was more straightforward than it would be with a first-order or CL-based ontology?
[12:26] Geoff Campbell: Thanks for the pointer, Mike
[12:33] David Eddy: One history starting point... IBM's (in?)famous machine translation in 1954. Scroll down a smidge on the image. https://timesmachine.nytimes.com/timesmachine/1954/01/08/issue.html
[12:35] RaviSharma: John: In that case slide 22 would imply that memory recall is an important cognitive function?
[12:43] RaviSharma: John: There is a study published that practicing and speaking / chanting improves these lobes over the years.
[12:44] janet singer: Slide 28 + is important for extending the notion of neural from a simplified connectionist model to use of complex insights from actual neuroscience
[12:46] Mike Bennett: @Janet +1 - it seems that an AI would need a frontal lobe analog i.e. notions of self/entity and goals, to address pragmatics.
[12:49] janet singer: Similarly, John is emphasizing that pragmatics cannot be left out in favor of syntax and semantics only. How did it come about that pragmatics is so often overlooked (e.g. in the Semantic Web)?
[12:49] Mike Bennett: As a minimum we need to be able to recreate the 'frontal lobe' of an organization for any business-useful semantic application or AI.
[12:50] Mike Bennett: Right. You can't interpret human language without the model of a human that humans carry around in the FL.
[12:52] RaviSharma: Ken - will Arun's slides be available as well?
[12:52] Ken Baclawski: @RaviSharma: I will ask Arun for his slides after the meeting.
[12:53] Mike Bennett: We need a unified framework of semantics, syntax, pragmatics and epistemology.
[12:54] janet singer: So for a better understanding of the full human language (semiotic) picture, is it misleading to talk so much about semantic modeling as the goal? Should one lead more with pragmatics, or doing things with signs?
[12:56] RaviSharma: From Jack Park: https://www.amazon.com/Artificial-Chemistries-Press-Wolfgang-Banzhaf/dp/026202943X
[12:57] Mike Bennett: @Janet +1
[12:58] janet singer: Mike, +1 Missed your comment. Definitely using C. Morris' treatment of semiotics as the theory of signs that spans those four seems like it would be feasible at this point.
[13:01] janet singer: Charles W. Morris. Foundations of the Theory of Signs. International Encyclopedia of Unified Science, vol. 1, no. 2. The University of Chicago Press, Chicago, 1938. https://pure.mpg.de/rest/items/item_2364493_2/component/file_2364492/content
[13:01] park: https://www.amazon.com/Artificial-Chemistries-Press-Wolfgang-Banzhaf/dp/026202943X
[13:03] Mike Bennett: Immersive cyberspace! Driven by semantics.
[13:06] RaviSharma: Arun - What you showed is training-set based or ground truth based ML to get classification or recognition, but what were different parameters for toxicity?
[13:06] janet singer: Yes, this is very impressive.
[13:11] RaviSharma: Arun: have you extended Prolog or plugged in solvers using Prolog?
[13:12] RaviSharma: Arun: as shown in intro, where are steps beyond ML?
[13:13] Daniel Schwabe: How is negation handled in this new engine?
[13:15] janet singer: Arun, How many other people working on QNLP, and is there any other approach that has comparable promise?
[13:16] park: https://www.neasqc.eu/use-case/quantum-natural-language-processing-qnlp/
[13:19] janet singer: Jack, thanks. And any other approach with comparable promise? (rhetorical question)
[13:21] Mike Bennett: Virtual Pragmatics?
[13:21] janet singer: Yes!! you need to represent the pragmatics
[13:31] park: https://en.wikipedia.org/wiki/Pandemonium_architecture
[13:34] park: https://en.wikipedia.org/wiki/Society_of_Mind
[13:34] park: http://www.acad.bg/ebook/ml/Society%20of%20Mind.pdf
[13:37] janet singer: 4E Cognition: Embodied, embedded, enacted, extended mind. 5E adds ecological, 6E adds evolutional
[13:38] janet singer: Other candidates are emotional, exoconscious, ethical,
[13:41] janet singer: Evaluative
[13:41] janet singer: 10E
[13:49] janet singer: Multi-E cognition helps for addressing representation of pragmatics (value-driven or goal-oriented social action with signs)
[13:55] janet singer: As John suggested at the Ontology Summit a few years ago, one arguably needs a meta-level resource for reconciling ontologies rather than attempting to integrate TLOs on the object level
[13:59] janet singer: John mentions CYC for reconciling micro-theories as views
[14:00] janet singer: Ken says people may agree on the meanings of terms but not the relationships between the terms.