
2.2: What Does an Ontology Look Like?


    Most of you may have heard of ‘ontologies’ only vaguely, or not at all. Instead of delving into the theory straight away, we’ll have a quick look at the artefact itself, to show that, in computing and intelligent software development practice, it is an object one can play with and manipulate. The actual artefact can appear in multiple formats that are tailored to the intended user, but at its heart there is a logic-based representation that the computer can process.

    African Wildlife Ontology (AWO) \(\PageIndex{1}\):

    Let us take as example the African Wildlife Ontology (AWO), which is a so-called ‘tutorial ontology’ that will return in the exercises. The AWO contains knowledge about wildlife, such as that giraffes eat leaves and twigs, that they are herbivores, that herbivores are animals, and so on. A mathematician may prefer to represent such knowledge with first order predicate logic. For instance:

    \[\forall x (Lion(x) \rightarrow \forall y (eats(x, y) \rightarrow Herbivore(y)) \wedge \exists z (eats(x, z) \wedge Impala(z)))\]

    that states that “all lions eat herbivores, and they also eat some impalas”. This axiom may be one of the axioms in the ontology. One can represent the same knowledge also in logics other than plain vanilla first order logic. For instance, in a Description Logic language, we have the same knowledge formally represented as:

    \[Lion \sqsubseteq \forall eats.Herbivore \sqcap \exists eats.Impala\]

    A domain expert, however, typically will prefer a more user-friendly rendering, such as an automatically generated (pseudo-)natural language rendering, e.g.:

    Each lion eats only herbivore and eats some Impala

    where the first “\(\forall\)” in equation 1.1.1 is verbalised as “Each” and the second one as “only”, the “\(\wedge\)” as “and”, and the “\(\exists\)” as “some”. Another option is to use a graphical language that is more or less precise in showing the knowledge, as shown in Figure 1.1.1.


    Figure 1.1.1: Two graphical renderings of lions eating only herbivores and at least some impala, with the OntoGraf plugin in the Protégé 4.x ontology development environment (A) and in UML class diagram style notation (B).
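
    Returning to the verbalisation above: to make the symbol-to-word mapping concrete, the following is a minimal sketch of a template-based verbaliser in Python. The function names and the hard-coded vocabulary are illustrative only, not part of the AWO, and verbalisers built into ontology tools are considerably more sophisticated.

    # Minimal template-based verbalisation sketch: each logical constructor is
    # mapped to a fixed phrase ("forall" -> "only", "exists" -> "some").
    # Function names and vocabulary are illustrative, not taken from the AWO file.
    def verbalise_only(cls, prop, filler):
        return f"Each {cls} {prop} only {filler}"

    def verbalise_some(prop, filler):
        return f"{prop} some {filler}"

    # Lion ⊑ ∀eats.Herbivore ⊓ ∃eats.Impala
    sentence = (verbalise_only("lion", "eats", "herbivore")
                + " and " + verbalise_some("eats", "Impala"))
    print(sentence)  # Each lion eats only herbivore and eats some Impala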

    Considering all those different renderings of the same knowledge, remember that an ontology is an engineering artefact that has to have a machine-processable format that faithfully adheres to the logic. None of the aforementioned representations is easily computer-processable, however. To this end, there are serialisations of the ontology into a text file that are easily computer-processable. The most widely used one is for the Web Ontology Language OWL, whose only mandatory exchange syntax is RDF/XML; a machine-processable version of the class lion in the RDF/XML format thus looks as follows:

    Machine-processable version of the class lion in RDF/XML format \(\PageIndex{2}\):

    <owl:Class rdf:about="&AWO;lion">
        <rdfs:subClassOf rdf:resource="&AWO;animal"/>
        <rdfs:subClassOf>
            <owl:Restriction>
                <owl:onProperty rdf:resource="&AWO;eats"/>
                <owl:someValuesFrom rdf:resource="&ontologies;AWO.owl#Impala"/>
            </owl:Restriction>
        </rdfs:subClassOf>
        <rdfs:subClassOf>
            <owl:Restriction>
                <owl:onProperty rdf:resource="&AWO;eats"/>
                <owl:allValuesFrom rdf:resource="&AWO;herbivore"/>
            </owl:Restriction>
        </rdfs:subClassOf>
        <rdfs:comment>Lions are animals that eat only herbivores.</rdfs:comment>
    </owl:Class>

    where the “\(\forall\)” from equation 1.1.1 is serialised as owl:allValuesFrom, the “\(\exists\)” is serialised as owl:someValuesFrom, and the subclassing (“\(\to\)” and “\(\sqsubseteq\)” in Eqs 1.1.1 and 1.1.2, respectively) as rdfs:subClassOf.
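
    That the file is machine-processable means a short program can pick these constructs out of it. The sketch below uses the (third-party) rdflib Python library to parse the RDF/XML and list the restrictions on the eats property; the file name AWO.owl is a placeholder for wherever a copy of the ontology has been saved.

    # Sketch: parse the RDF/XML serialisation and list the owl:Restrictions.
    # Assumes the rdflib library is installed and the ontology is saved
    # locally as "AWO.owl" (the file name is a placeholder).
    from rdflib import Graph
    from rdflib.namespace import OWL, RDF

    g = Graph()
    g.parse("AWO.owl", format="xml")   # "xml" = the RDF/XML syntax

    for r in g.subjects(RDF.type, OWL.Restriction):
        prop = g.value(r, OWL.onProperty)
        some_filler = g.value(r, OWL.someValuesFrom)   # the "exists"/some part
        only_filler = g.value(r, OWL.allValuesFrom)    # the "forall"/only part
        if some_filler is not None:
            print(f"some: {prop} -> {some_filler}")
        if only_filler is not None:
            print(f"only: {prop} -> {only_filler}")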

    You typically will not have to write an ontology in this RDF/XML format. As a computer scientist, you may design tools that have to process or modify such machine-processable ontology files, though even then there are tool development toolkits and APIs that cover many of those tasks. For authoring an ontology, there are ontology development environments (ODEs) that render the ontology graphically, textually, or with a logic view. A screenshot of one such tool, Protégé, is included in Figure 1.2.1.
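
    For illustration, a small fragment of the AWO could also be authored programmatically rather than in an ODE. The sketch below uses the (third-party) Owlready2 Python library, with an illustrative ontology IRI and output file name, to declare the lion class with the same two restrictions and save the result in RDF/XML.

    # Sketch: author a small AWO fragment programmatically with Owlready2.
    # The ontology IRI and the output file name are illustrative placeholders.
    from owlready2 import get_ontology, Thing, ObjectProperty

    onto = get_ontology("http://example.org/tutorial/AWO.owl")

    with onto:
        class animal(Thing): pass
        class herbivore(animal): pass
        class Impala(animal): pass
        class eats(ObjectProperty):
            domain = [animal]
            range = [animal]
        class lion(animal): pass
        # "each lion eats only herbivore and eats some Impala"
        lion.is_a.append(eats.only(herbivore))
        lion.is_a.append(eats.some(Impala))

    onto.save(file="AWO-fragment.owl", format="rdfxml")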


    This page titled 2.2: What Does an Ontology Look Like? is shared under a CC BY 4.0 license and was authored, remixed, and/or curated by Maria Keet via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.
