A Comprehensive Analysis of the Machine Learning Tools and Techniques in Enhancing the Cumulative Effectiveness of Natural Language Processing (NLP)
Saniya Malik
DAV Police Public School, Gurugram
Pages: 52-55
Vol: 13, Issue: 2, 2023
Receiving Date: 2023-06-02
Acceptance Date: 2023-06-21
Publication Date: 2023-06-29
http://doi.org/10.37648/ijrst.v13i02.007
Abstract
This paper examines in depth the machine learning (ML) algorithms used in Natural Language Understanding (NLU) to power natural language applications such as sentiment analysis, text classification, and question answering. The paper thoroughly surveys the varied applications, inherent challenges, and promising prospects of machine learning in NLU, offering significant insight into its transformative impact on language processing and comprehension.
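As a minimal illustration of the text-classification application mentioned above, the sketch below trains a simple ML classifier on a tiny hypothetical corpus (the texts, labels, and model choice are assumptions for demonstration, not taken from the paper): TF-IDF features feed a logistic-regression model via scikit-learn.

```python
# Minimal text-classification sketch: TF-IDF features + logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus with hypothetical sentiment labels.
texts = [
    "I loved this film, it was wonderful",
    "A fantastic and moving story",
    "Absolutely terrible, a waste of time",
    "I hated every minute of it",
]
labels = ["pos", "pos", "neg", "neg"]

# Pipeline: turn raw text into TF-IDF vectors, then fit a linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Classify an unseen sentence.
prediction = model.predict(["what a wonderful story"])[0]
print(prediction)
```

Real NLU systems replace the sparse TF-IDF features with learned dense representations (word embeddings or Transformer encoders), but the overall pattern, featurize text then fit a classifier, is the same.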