
A Deep Model for Meaning-aware Word Embedding

Amal Bouraoui 1, Salma Jamoussi 2, Abdelmajid Ben Hamadou 3

  1, 2, 3. Multimedia Information Systems and Advanced Computing Laboratory (MIRACL), Sfax University, Technopole of Sfax, B.P. 242, 3021 Sfax, Tunisia.

Correspondence should be addressed to: amal.bouraoui@gmail.com.


Section: Research Paper, Product Type: Journal-Paper
Vol.8, Issue.2, pp.25-31, Apr-2020


Online published on Apr 30, 2020


Copyright © Amal Bouraoui, Salma Jamoussi, Abdelmajid Ben Hamadou. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
 




How to Cite this Paper


IEEE Style Citation: Amal Bouraoui, Salma Jamoussi, Abdelmajid Ben Hamadou, “A Deep Model for Meaning-aware Word Embedding,” International Journal of Scientific Research in Computer Science and Engineering, Vol.8, Issue.2, pp.25-31, 2020.

MLA Style Citation: Amal Bouraoui, Salma Jamoussi, Abdelmajid Ben Hamadou. "A Deep Model for Meaning-aware Word Embedding." International Journal of Scientific Research in Computer Science and Engineering 8.2 (2020): 25-31.

APA Style Citation: Amal Bouraoui, Salma Jamoussi, Abdelmajid Ben Hamadou (2020). A Deep Model for Meaning-aware Word Embedding. International Journal of Scientific Research in Computer Science and Engineering, 8(2), 25-31.

BibTex Style Citation:
@article{Bouraoui_2020,
author = {Amal Bouraoui and Salma Jamoussi and Abdelmajid Ben Hamadou},
title = {A Deep Model for Meaning-aware Word Embedding},
journal = {International Journal of Scientific Research in Computer Science and Engineering},
issue_date = {4 2020},
volume = {8},
issue = {2},
month = {4},
year = {2020},
issn = {2347-2693},
pages = {25-31},
url = {https://www.isroset.org/journal/IJSRCSE/full_paper_view.php?paper_id=1811},
publisher = {IJCSE, Indore, INDIA},
}

RIS Style Citation:
TY - JOUR
UR - https://www.isroset.org/journal/IJSRCSE/full_paper_view.php?paper_id=1811
TI - A Deep Model for Meaning-aware Word Embedding
T2 - International Journal of Scientific Research in Computer Science and Engineering
AU - Amal Bouraoui
AU - Salma Jamoussi
AU - Abdelmajid Ben Hamadou
PY - 2020
DA - 2020/04/30
PB - IJCSE, Indore, INDIA
SP - 25
EP - 31
IS - 2
VL - 8
SN - 2347-2693
ER -


Abstract:
Context plays an important role in the construction and representation of meaning: words are interpreted differently depending on the contexts in which they appear. Deep neural networks have recently achieved great success in representing a word's meaning as a single vector. Other researchers address the fact that a word can have several meanings by constructing several vectors per word. However, the first family of models assumes a single representation for each word, while the second requires either a sense inventory or a preprocessing step that clusters the contexts of each word; the latter also ignores the complicated correlations among words and their contexts. In this paper, we investigate another direction that consists of representing a word's meaning by a vector that evolves over the word's set of contexts. To this end, we introduce a novel deep model that applies auto-encoders recursively over the left/right context around a target word. We evaluate our model on the semantic similarity task. Experimental results demonstrate that our deep model outperforms several competitive models that represent word meaning with either a single vector or multiple vectors.
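The abstract does not give the exact architecture, so the following is only a minimal PyTorch sketch of the general idea it describes: recursively folding left/right context word vectors into a target word's vector with an auto-encoder. The class name RecursiveContextAutoEncoder, the layer sizes, the tanh activation, the reconstruction loss, and the folding order are all assumptions for illustration, not the authors' implementation.

    import torch
    import torch.nn as nn

    class RecursiveContextAutoEncoder(nn.Module):
        """Hypothetical sketch: folds context word vectors into a target word's
        vector one word at a time, using an auto-encoder at each step."""

        def __init__(self, dim: int):
            super().__init__()
            self.encode = nn.Linear(2 * dim, dim)   # [meaning ; context word] -> new meaning
            self.decode = nn.Linear(dim, 2 * dim)   # reconstruction used as training signal

        def step(self, meaning: torch.Tensor, context_word: torch.Tensor):
            pair = torch.cat([meaning, context_word], dim=-1)
            new_meaning = torch.tanh(self.encode(pair))
            # auto-encoder reconstruction loss for this composition step
            loss = nn.functional.mse_loss(self.decode(new_meaning), pair)
            return new_meaning, loss

        def forward(self, target: torch.Tensor, left_ctx, right_ctx):
            # target: a (dim,) vector; left_ctx / right_ctx: lists of (dim,) context vectors.
            # The order in which contexts are folded in is also an assumption.
            meaning, total_loss = target, torch.zeros(())
            for word_vec in list(left_ctx) + list(right_ctx):
                meaning, step_loss = self.step(meaning, word_vec)
                total_loss = total_loss + step_loss
            return meaning, total_loss

For the semantic similarity evaluation mentioned in the abstract, the resulting meaning vectors would typically be compared with cosine similarity, e.g. torch.nn.functional.cosine_similarity(v1, v2, dim=0) between two vectors produced by the model above.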

Key-Words / Index Terms:
Deep Learning; Word Embedding; Word Meaning; Deep Neural Networks; Auto-encoders; Recursive Auto-encoders

