
ConferenceCall 2023 11 08

Ontolog Forum



== Agenda ==
* '''[[AnatolyLevenchuk|Anatoly Levenchuk]]''', strategist and blogger at [http://ailev.livejournal.com Laboratory Log]
** ''Knowledge graphs and large language models in cognitive architectures''
** This talk discusses styles of definitions for knowledge graphs (KGs) combined with large language models (LLMs). KG architectures and systems are reviewed, drawn from the Ontolog Forum's 2020 Communique. A framework is proposed for a cognitive architecture that uses both LLMs and KGs for the evolution of knowledge during 4E (embodied, extended, embedded, enacted) cognition. In this framework, ontologies are understood as answers to the question "What is in the world?" and can be found in representations that vary across a spectrum of formality/rigor. An example is given of ontology engineering training in management, where upper-level ontologies are given to students as informal course texts (with the goal of obtaining a fine-tuned LLM within the "neural networks" of students' brains), coupled with lower-level ontologies that are more formal (such as data schemas for databases and knowledge graphs).
<!-- ** [https://bit.ly/3sljmXt Slides] -->
* '''[[ArunMajumdar|Arun Majumdar]]''' and '''[[JohnSowa|John Sowa]]''', [https://permion.ai/ Permion AI]
** ''Trustworthy Computation: Diagrammatic Reasoning With and About LLMs''
** Large Language Models (LLMs) were designed for machine translation (MT). Although LLM methods cannot do any reasoning by themselves, they can often find and apply reasoning patterns from the vast resources of the WWW. For common problems, they frequently find a correct solution. For more complex problems, they may construct a solution that is partially correct for some applications but disastrously wrong, or even hallucinated, for others. Systems developed by Permion use LLMs for what they do best, but combine them with precise and trusted methods of diagrammatic reasoning based on conceptual graphs (CGs). They take advantage of the full range of technology developed by 60+ years of AI, computer science, and computational linguistics. For any application, Permion methods derive an ontology tailored to the policies, rules, and specifications of the project or business. All programs and results they produce are guaranteed to be consistent with that ontology.
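The Permion abstract describes a hybrid pattern in which LLM output is checked against a project ontology before it is trusted. As a purely illustrative sketch (not from the talk, and not Permion's actual method), the following toy Python validator accepts only proposed triples whose subject and object classes satisfy hand-written domain/range constraints; all names, classes, and constraints here are invented for illustration.

```python
# Illustrative sketch: an LLM *proposes* facts as (subject, relation, object)
# triples, and a small hand-written ontology *validates* them, so only
# ontology-consistent triples are accepted into the knowledge graph.

# Toy ontology: a subclass chain plus domain/range constraints per relation.
SUBCLASS_OF = {
    "Dog": "Mammal",
    "Mammal": "Animal",
    "Car": "Vehicle",
}

RELATION_CONSTRAINTS = {
    # relation: (required subject class, required object class)
    "eats": ("Animal", "Animal"),
}

INSTANCE_OF = {
    "rex": "Dog",
    "tom": "Mammal",
    "herbie": "Car",
}

def is_subclass(cls, ancestor):
    """Walk the subclass chain upward until ancestor is found or the chain ends."""
    while cls is not None:
        if cls == ancestor:
            return True
        cls = SUBCLASS_OF.get(cls)
    return False

def validate_triple(subj, rel, obj):
    """Accept a proposed triple only if it satisfies the relation's
    domain/range constraints under the toy ontology."""
    if rel not in RELATION_CONSTRAINTS:
        return False
    domain, rng = RELATION_CONSTRAINTS[rel]
    s_cls = INSTANCE_OF.get(subj)
    o_cls = INSTANCE_OF.get(obj)
    return (s_cls is not None and o_cls is not None
            and is_subclass(s_cls, domain) and is_subclass(o_cls, rng))

# Proposals as they might come from an LLM extraction step (hard-coded here).
proposals = [("rex", "eats", "tom"), ("herbie", "eats", "tom")]
accepted = [t for t in proposals if validate_triple(*t)]
print(accepted)  # only the ontology-consistent triple survives
```

The point of the sketch is the division of labor: the statistical component generates candidates, while a symbolic component with explicit semantics decides what is admitted, which is one way to read the abstract's consistency guarantee.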



Revision as of 19:18, 6 November 2023

Session: Broader thoughts
Duration: 1 hour
Date/Time: 8 Nov 2023 17:00 GMT (9:00am PST / 12:00pm EST / 5:00pm GMT / 6:00pm CET)
Convener: Andrea Westerinen and Mike Bennett

Ontology Summit 2024: Broader thoughts


== Conference Call Information ==

* Date: Wednesday, 8 November 2023
* Start Time: 9:00am PST / 12:00pm EST / 6:00pm CET / 5:00pm GMT / 1700 UTC
** Note that Daylight Saving Time has ended in Europe, Canada, and the US.
** ref: World Clock
* Expected Call Duration: 1 hour
* Video Conference URL: https://bit.ly/48lM0Ik
** Conference ID: 876 3045 3240
** Passcode: 464312

The unabbreviated URL is: https://us02web.zoom.us/j/87630453240?pwd=YVYvZHRpelVqSkM5QlJ4aGJrbmZzQT09

== Participants ==

== Discussion ==

== Resources ==

== Previous Meetings ==

* ConferenceCall 2023 11 01: Demos of information extraction via hybrid systems
* ConferenceCall 2023 10 25: A look across the industry, Part 2
* ConferenceCall 2023 10 18: A look across the industry, Part 1
* ... further results

== Next Meetings ==

* ConferenceCall 2023 11 15: Synthesis
* ConferenceCall 2024 02 21: Overview
* ConferenceCall 2024 02 28: Foundations and Architectures
* ... further results