
A Comprehensive Analysis of the Machine Learning Tools and Techniques in Enhancing the Cumulative Effectiveness of Natural Language Processing (NLP)

Saniya Malik

DAV Police Public School, Gurugram

Pages: 52-55

Vol: 13, Issue: 2, 2023

Receiving Date: 2023-06-02

Acceptance Date: 2023-06-21

Publication Date: 2023-06-29


DOI: http://doi.org/10.37648/ijrst.v13i02.007

Abstract

This paper examines in depth the algorithms used in Natural Language Understanding (NLU) with machine learning (ML) to enable natural language applications such as sentiment analysis, text classification, and question answering. It thoroughly discusses the various applications, inherent challenges, and promising prospects of machine learning in NLU, providing valuable insight into its transformative impact on language processing and comprehension.
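As a concrete illustration of one application named in the abstract, the sketch below trains a sentiment classifier with scikit-learn's TfidfVectorizer and LogisticRegression. This is a minimal, hypothetical example: the library choice, the tiny dataset, and the labels are assumptions made here for illustration and are not taken from the paper itself.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented training set: 1 = positive sentiment, 0 = negative.
texts = [
    "I loved this movie, brilliant acting and a moving story",
    "A delightful film, wonderful from start to finish",
    "Terrible plot and wooden, boring dialogue",
    "Dull and badly written, I walked out halfway through",
]
labels = [1, 1, 0, 0]

# TF-IDF maps each document to a sparse weighted bag-of-words vector;
# logistic regression then learns a linear decision boundary over those vectors.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),  # unigram and bigram features
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)

# Words overlapping the training examples drive the prediction.
print(model.predict(["a brilliant and moving film"]))     # likely [1]
print(model.predict(["boring, badly written dialogue"]))  # likely [0]

TF-IDF with a linear classifier is a standard lightweight baseline for text classification; modern transformer-based models generally achieve higher accuracy, at much greater data and compute cost.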


