Texas Journal of Engineering and Technology
https://zienjournals.com/index.php/tjet
Texas Journal of Engineering and Technology provides authors the opportunity to publish their original research articles in the field of engineering and technological advancement.

Frequency of Publication: Monthly
Acceptance Notification: 45 days
ISSN (Online): 2770-4491 (https://portal.issn.org/resource/ISSN/2770-4491)
DOI Prefix: 10.62480
Publisher: Zien Journals
Language: en-US

User Rights
Under the Creative Commons Attribution-NonCommercial 4.0 International license (CC BY-NC, https://creativecommons.org/licenses/by-nc/4.0/), the author(s) and users are free to share (copy, distribute, and transmit) the contribution.

Rights of Authors
Authors retain the following rights:
1. Copyright and other proprietary rights relating to the article, such as patent rights;
2. the right to use the substance of the article in future works, including lectures and books;
3. the right to reproduce the article for their own purposes, provided the copies are not offered for sale;
4. the right to self-archive the article.

Development of Deep Learning Models and Algorithms for Language Processing in Uzbek
https://zienjournals.com/index.php/tjet/article/view/5873
This article focuses on the development of deep learning models and algorithms specifically designed for Uzbek language processing within the IT field. A comprehensive approach involving data collection, preprocessing, model selection, and evaluation was employed. Experiments were conducted with RNN, LSTM, and transformer-based models such as BERT and GPT, with the transformer models yielding superior results. Key challenges included limited datasets and the complex morphological structure of Uzbek. The findings suggest that fine-tuned transformer models, especially with language-specific preprocessing, can significantly improve performance in language understanding tasks for low-resource languages.

Suyunova Zamira, Erkinova Dilnoza
Copyright (c) 2025 Suyunova Zamira, Erkinova Dilnoza
https://creativecommons.org/licenses/by-nc/4.0
Published: 2025-01-10
Vol. 40, pp. 1–4
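
The article's code is not reproduced in this record; purely as an illustration of the approach the abstract describes (fine-tuning a pretrained transformer for a low-resource language), the sketch below assumes the Hugging Face Transformers library, uses multilingual BERT (whose pretraining corpus includes Uzbek), and invents a tiny binary sentiment batch. The model name, labels, and example sentences are assumptions, not the authors' actual setup.

```python
# Minimal fine-tuning sketch (illustrative only, not the authors' code).
# Assumes: Hugging Face Transformers + PyTorch; mBERT as a checkpoint that
# covers Uzbek; a hypothetical binary sentiment task with toy sentences.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "bert-base-multilingual-cased"  # a language-specific checkpoint could be swapped in
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Toy Uzbek batch with hypothetical labels (1 = positive, 0 = negative).
texts = ["Bu kitob juda qiziqarli.", "Xizmat sifati yomon edi."]
labels = torch.tensor([1, 0])

# Subword tokenization; language-specific preprocessing (e.g. morphological
# normalization, as the abstract suggests) would happen before this step.
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):  # a real run would iterate a DataLoader over a labeled Uzbek corpus
    optimizer.zero_grad()
    outputs = model(**batch, labels=labels)  # loss computed internally from logits vs. labels
    outputs.loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss = {outputs.loss.item():.4f}")
```

The same skeleton extends to other language-understanding tasks by swapping the classification head and the dataset; only the tokenizer and pretrained weights carry over unchanged.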