A Cognitive Reference based Model for Learning Compositional Hierarchies with Whole-composite Tags

Anshuman Saxena, Ashish Bindal, Alain Wegmann

Abstract

A compositional hierarchy is the default organization of knowledge acquired for the purpose of specifying the design requirements of a service. Existing methods for learning compositional hierarchies from natural-language text interpret composition as an exclusively propositional form of the part-whole relation. However, the lexico-syntactic patterns used to identify occurrences of part-whole relations fail to decode the experientially grounded information that is often embedded in acts of natural-language expression, e.g. construction and delivery. Our basic idea is to take a situated view of conceptualization and to model composition as the cognitive act of invoking one category to refer to another. A mutually interdependent set of categories is considered conceptually inseparable and is assigned its own level of abstraction in the hierarchy. The presence of such levels in the compositional hierarchy highlights the need to model these categories as a unified whole, in which each category can only be characterized in the context of the behavior of the set as a whole. We adopt an object-oriented representation that models categories as entities and relations as cognitive references inferred from syntactic dependencies. The resulting digraph is then analyzed for cyclic references, which are resolved by introducing an additional level of abstraction for each cycle.
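The final step of the abstract — detecting cyclic references in the digraph of categories and collapsing each cycle into its own level of abstraction — can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the category names, the `collapse_cycles` helper, and the `level:` naming convention are hypothetical; only the use of Tarjan's strongly connected components algorithm is taken from the paper's reference list (Tarjan, 1972).

```python
# Illustrative sketch (not the paper's actual code): categories are nodes,
# cognitive references are directed edges. Mutually interdependent sets of
# categories form cycles, found here with Tarjan's strongly connected
# components algorithm; each non-trivial component is collapsed into a
# single composite node, i.e. given its own level of abstraction.
from collections import defaultdict


def tarjan_scc(graph):
    """Return the strongly connected components of a digraph
    given as {node: [successors]}."""
    index, lowlink, on_stack = {}, {}, set()
    stack, sccs, counter = [], [], [0]

    def strongconnect(v):
        index[v] = lowlink[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index:
                strongconnect(w)
                lowlink[v] = min(lowlink[v], lowlink[w])
            elif w in on_stack:
                lowlink[v] = min(lowlink[v], index[w])
        if lowlink[v] == index[v]:  # v is the root of a component
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in list(graph):
        if v not in index:
            strongconnect(v)
    return sccs


def collapse_cycles(graph):
    """Replace each cyclic set of categories by a fresh composite node,
    yielding an acyclic compositional hierarchy."""
    owner = {}  # category -> representative node
    for comp in tarjan_scc(graph):
        name = "level:" + "+".join(sorted(comp)) if len(comp) > 1 else comp[0]
        for v in comp:
            owner[v] = name
    dag = defaultdict(set)
    for v, succs in graph.items():
        for w in succs:
            if owner[v] != owner[w]:
                dag[owner[v]].add(owner[w])
    return {k: sorted(vs) for k, vs in dag.items()}


# "order" and "payment" refer to each other, so they are conceptually
# inseparable and become one unified whole at a new abstraction level:
refs = {"service": ["order"], "order": ["payment"],
        "payment": ["order"], "invoice": []}
print(collapse_cycles(refs))
# → {'service': ['level:order+payment']}
```

Collapsing components rather than deleting edges preserves the interdependence: the unified whole keeps its external references while its members remain characterized only by the behavior of the set.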

References

  1. Ashby, W. R. (1964). Introduction to Cybernetics, Methuen.
  2. Barsalou, L. (2009). "Simulation, situated conceptualization, and prediction." Philosophical Transactions of the Royal Society B: Biological Sciences 364(1521): 1281-1289.
  3. Barsalou, L. W. (2003). "Situated simulation in the human conceptual system." Language and Cognitive Processes 18(5): 513-562.
  4. Barwise, J. and J. Perry (1983). Situations and attitudes. Cambridge, MA, MIT Press.
  5. Bunge, M. (2004). Emergence and Convergence: Qualitative Novelty and the Unity of Knowledge, University of Toronto.
  6. de Marneffe, M.-C. and C. D. Manning (2011). Stanford Typed Dependencies Manual.
  7. de Marneffe, M.-C., B. MacCartney, et al. (2006). Generating Typed Dependency Parses from Phrase Structure Parses. LREC.
  8. Dorr, B. J. (1993). Machine Translation: A view from the lexicon. Cambridge, Massachusetts, The MIT Press.
  9. Dowty, D. (1991). "Thematic Proto-Roles and Argument Selection." Language 67(3): 547-619.
  10. Fauconnier, G. (1994). Mental Spaces: Aspects of Meaning Construction in Natural Language, Cambridge University Press.
  11. Fillmore, C. J. (1968). The Case for Case. Universals in Linguistic Theory. E. Bach and R. T. Harms (eds.). New York, Holt, Rinehart, and Winston.
  12. Girju, R. and D. I. Moldovan (2002). Text Mining for Causal Relations. Proceedings of the Fifteenth International Florida Artificial Intelligence Research Society Conference, AAAI Press: 360-364.
  13. Stanford Natural Language Processing Group (2012). "The Stanford Parser: A statistical parser." Retrieved October 07, 2012, from http://nlp.stanford.edu/software/lex-parser.shtml.
  14. Hand, D. J., P. Smyth, et al. (2001). Principles of data mining, MIT Press.
  15. Harris, Z. (1968). Mathematical structures of language, Interscience Publishers.
  16. Hearst, M. A. (1992). Automatic acquisition of hyponyms from large text corpora. Proceedings of the 14th conference on Computational linguistics - Volume 2. Nantes, France, Association for Computational Linguistics: 539-545.
  17. Jackendoff, R. (1987). "The Status of Thematic Relations in Linguistic Theory." Linguistic Inquiry 18(3): 369-411.
  18. Khoo, C. S. G., S. Chan, et al. (2000). Extracting causal knowledge from a medical database using graphical patterns. Proceedings of the 38th Annual Meeting on Association for Computational Linguistics. Hong Kong, Association for Computational Linguistics: 336-343.
  19. Lakoff, G. and M. Johnson (2003). Metaphors We Live By, University Of Chicago Press.
  20. Langacker, R. (1987). Foundations of Cognitive Grammar. Vol. 1: Theoretical Prerequisites. Stanford, Stanford University Press.
  21. Langacker, R. W. (1993). "Reference-point constructions." Cognitive Linguistics 4(1): 1-38.
  22. Langacker, R. W. (1994). "Structural Syntax: The View from Cognitive Grammar." Semiotique 6/7: 69-84.
  23. Langacker, R. W. (2008). Cognitive Grammar: A Basic Introduction. New York, Oxford University Press.
  24. Miller, G. A. (1990). "Nouns in WordNet: A Lexical Inheritance System." International Journal of Lexicography 3(4): 245-264.
  25. Narrog, H. (2005). "On defining modality again." Language Sciences 27(2): 165-192.
  26. Nie, J.-Y. (2003). "Query expansion and query translation as logical inference." J. Am. Soc. Inf. Sci. Technol. 54(4): 335-346.
  27. Nivre, J. (2005). Dependency Grammar and Dependency Parsing.
  28. Rosch, E. (1975). "Cognitive reference points." Cognitive Psychology 7(4): 532-547.
  29. Saint-Dizier, P. (2006). Introduction to the Syntax and Semantics of Prepositions. Syntax and Semantics of Prepositions. P. Saint-Dizier (ed.), Springer Netherlands. 29: 1-25.
  30. Saxena, A. B. and A. Wegmann (2012). From Composites to Service Systems: The Role of Emergence in Service Design. IEEE International Conference on Systems, Man, and Cybernetics. Seoul, Korea.
  31. Sedgewick, R. (2011). Algorithms. Boston, MA, Addison-Wesley.
  32. Simon, H. A. (1962). "The architecture of complexity." Proceedings of the American Philosophical Society 106(6): 467-482.
  33. Sowa, J. F. (1984). Conceptual structures: Information processing in mind and machine. Reading, MA, Addison-Wesley.
  34. Talmy, L. (1988). "Force Dynamics in Language and Cognition." Cognitive Science 12(1): 49-100.
  35. Tarjan, R. (1972). "Depth-First Search and Linear Graph Algorithms." SIAM Journal on Computing 1(2): 146-160.
  36. Tesnière, L. (1959). Éléments de syntaxe structurale. Paris, Klincksieck.
  37. Tribushinina, E. (2008). Cognitive reference points: semantics beyond the prototypes in adjectives of space and colour. Doctoral Thesis, Leiden University.
  38. Wilson, D. and D. Sperber (1993). "Linguistic form and relevance." Lingua 90(1-2): 1-25.
  39. Winston, M., R. Chaffin, et al. (1987). "A Taxonomy of Part-Whole Relations." Cognitive Science 11(4): 417-444.
  40. Zarri, G. (1997). Conceptual modelling of the "meaning" of textual narrative documents. Foundations of Intelligent Systems. Z. Ras and A. Skowron (eds.), Springer Berlin / Heidelberg. 1325: 550-559.


Paper Citation


in Harvard Style

Saxena A., Bindal A. and Wegmann A. (2013). A Cognitive Reference based Model for Learning Compositional Hierarchies with Whole-composite Tags. In Proceedings of the International Conference on Knowledge Discovery and Information Retrieval and the International Conference on Knowledge Management and Information Sharing - Volume 1: KDIR, (IC3K 2013) ISBN 978-989-8565-75-4, pages 119-127. DOI: 10.5220/0004542201190127


in Bibtex Style

@conference{kdir13,
author={Anshuman Saxena and Ashish Bindal and Alain Wegmann},
title={A Cognitive Reference based Model for Learning Compositional Hierarchies with Whole-composite Tags},
booktitle={Proceedings of the International Conference on Knowledge Discovery and Information Retrieval and the International Conference on Knowledge Management and Information Sharing - Volume 1: KDIR, (IC3K 2013)},
year={2013},
pages={119-127},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004542201190127},
isbn={978-989-8565-75-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Knowledge Discovery and Information Retrieval and the International Conference on Knowledge Management and Information Sharing - Volume 1: KDIR, (IC3K 2013)
TI - A Cognitive Reference based Model for Learning Compositional Hierarchies with Whole-composite Tags
SN - 978-989-8565-75-4
AU - Saxena A.
AU - Bindal A.
AU - Wegmann A.
PY - 2013
SP - 119
EP - 127
DO - 10.5220/0004542201190127