Ontologies contain rich knowledge within a domain, which can be divided into two categories, namely extensional knowledge and intensional knowledge. Extensional knowledge provides information about the concrete instances that belong to specific concepts in the ontology, while intensional knowledge details inherent properties, characteristics, and semantic associations among concepts. However, existing ontology embedding approaches fail to consider both extensional knowledge and intensional knowledge simultaneously. In this paper, we propose a novel ontology embedding approach named EIKE (Extensional and Intensional Knowledge Embedding) that represents ontologies in two spaces, called the extensional space and the intensional space. EIKE presents a unified framework for embedding instances, concepts, and their relations in an ontology, applying a geometry-based method to model extensional knowledge and a pretrained language model to model intensional knowledge, which can capture both structural information and textual information. Experimental results show that EIKE significantly outperforms state-of-the-art methods on three datasets for both triple classification and link prediction, indicating that EIKE provides a more comprehensive and representative perspective of the domain.

The Importance of Ontology Embedding in Knowledge Representation

Introduction: In the field of knowledge representation, ontologies play a crucial role in capturing and organizing domain-specific knowledge. They provide a formal framework for representing concepts, relationships, and properties within a given domain. However, existing ontology embedding approaches have limitations when it comes to effectively capturing both extensional and intensional knowledge.

The Two Categories of Knowledge in Ontologies

Extensional knowledge refers to specific instances or examples that belong to a particular concept within an ontology. This type of knowledge provides concrete information about the instances themselves and their relationships to other concepts. On the other hand, intensional knowledge focuses on the inherent properties, characteristics, and semantic associations among concepts, offering a more abstract and conceptual understanding of the domain.

The Limitations of Existing Approaches: Traditional ontology embedding methods have not fully addressed the integration of extensional and intensional knowledge simultaneously. By failing to capture both types of knowledge, these approaches may overlook important connections between concepts and limit the overall representational power of ontologies.

A Novel Approach: EIKE (Extensional and Intensional Knowledge Embedding)

In this paper, a novel approach called EIKE (Extensional and Intensional Knowledge Embedding) is proposed. EIKE addresses the limitations of existing methods by representing ontologies in two distinct spaces: the extensional space and the intensional space.

The Extensional Space: EIKE incorporates a geometry-based modeling technique to represent extensional knowledge within an ontology. This approach captures geometric relationships among instances, concepts, and their connections. By leveraging geometric modeling, EIKE can better capture structural information and interconnections within the knowledge domain.
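To make the geometric idea concrete, here is a minimal sketch of one common style of geometry-based extensional modeling: instances as points and concepts as balls (a center vector plus a radius), where an instanceOf assertion is scored by how far the instance point falls outside the concept's ball. The names, dimensions, and scoring function below are illustrative assumptions, not EIKE's actual formulation.

```python
import numpy as np

# Illustrative geometry-based extensional space (NOT EIKE's exact model):
# - each instance is a point in R^DIM
# - each concept is a ball: (center, radius)
# - instanceOf(i, C) is scored by the distance from the point to the
#   ball's surface, so 0 means the instance lies inside the concept.
DIM = 8
rng = np.random.default_rng(0)

instances = {"paris": rng.normal(size=DIM)}
concepts = {"City": (rng.normal(size=DIM), 2.0)}  # (center, radius)

def instance_of_score(instance: str, concept: str) -> float:
    """Lower score = instance more plausibly belongs to the concept."""
    point = instances[instance]
    center, radius = concepts[concept]
    # Distance to the ball's surface; clipped at 0 if already inside.
    return max(0.0, float(np.linalg.norm(point - center)) - radius)

print(instance_of_score("paris", "City"))  # a non-negative margin
```

A margin-style score like this can be plugged into a standard ranking loss during training, which is one way geometric models learn to place instances inside the regions of the concepts they belong to.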

The Intensional Space: Intensional knowledge, on the other hand, requires a language-based modeling approach to effectively capture the textual information associated with concepts. Consequently, EIKE utilizes a pretrained language model to represent intensional knowledge. By leveraging the semantic understanding and contextual representation provided by pretrained language models, EIKE can capture the subtle nuances and semantic associations among concepts.
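The intensional side can be sketched in the same spirit: encode each concept's textual description into a vector and estimate semantic association by cosine similarity. EIKE uses a pretrained language model as the encoder; in the self-contained sketch below, a toy bag-of-words encoder stands in for it, and all names and descriptions are illustrative.

```python
import numpy as np

# Toy stand-in for a pretrained language model encoder: a bag-of-words
# count vector over a tiny vocabulary. In EIKE the encoder would be a
# real pretrained LM producing contextual embeddings.
VOCAB = ["city", "settlement", "urban", "animal", "mammal", "large"]

def encode(description: str) -> np.ndarray:
    """Map a concept description to a vector (stand-in encoder)."""
    words = description.lower().split()
    return np.array([words.count(w) for w in VOCAB], dtype=float)

def similarity(desc_a: str, desc_b: str) -> float:
    """Cosine similarity between two encoded descriptions."""
    a, b = encode(desc_a), encode(desc_b)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

s_related = similarity("a large urban settlement", "an urban city settlement")
s_unrelated = similarity("a large urban settlement", "a large mammal animal")
print(s_related > s_unrelated)  # related concepts score higher
```

With a genuine pretrained encoder, the same cosine-similarity comparison captures far subtler semantic associations than word overlap, which is the motivation for using a language model in the intensional space.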

Advantages and Experimental Results

Comprehensive and Representative Perspective: EIKE’s innovative approach of integrating extensional and intensional knowledge embedding provides a more comprehensive and representative perspective of the knowledge domain. By capturing both concrete and conceptual aspects of ontologies, EIKE offers a holistic representation that enables a deeper understanding of the domain in question.

Outperforming State-of-the-art Methods: Experimental results demonstrate that EIKE significantly outperforms existing methods on three datasets for triple classification and link prediction tasks. This superiority further highlights the effectiveness of EIKE in capturing the intricacies of extensional and intensional knowledge, thus enriching knowledge representation within ontologies.

Conclusion

EIKE presents a multi-disciplinary approach that combines geometric modeling and pretrained language models to embed both extensional and intensional knowledge within ontologies. The integration of both types of knowledge significantly enhances the representational power and comprehensiveness of ontologies. The experimental results validate the superiority of EIKE over existing methods, making it a valuable contribution to ontology embedding research.
