Research Article

Fine-tuning Large Language Models for Turkish Flutter Code Generation

Volume: 8 Number: 4 December 29, 2025

Abstract

The rapid advancement of large language models (LLMs) for code generation has largely centered on English programming queries. This paper addresses a low-resource language scenario, specifically Turkish, in the context of Flutter mobile app development. In this study, two representative LLMs (a 4B-parameter multilingual model and a 3B-parameter code-specialized model) are fine-tuned on a new Turkish question-and-answer dataset for Flutter/Dart. Fine-tuning with parameter-efficient techniques yields dramatic improvements in code generation quality: Bilingual Evaluation Understudy (BLEU), Recall-Oriented Understudy for Gisting Evaluation (ROUGE-L), Metric for Evaluation of Translation with Explicit Ordering (METEOR), Bidirectional Encoder Representations from Transformers Score (BERTScore), and CodeBLEU scores all increase significantly. The rate of correct solutions rises from roughly 30–70% for the base models to 80–90% after fine-tuning. An analysis of the performance trade-offs between the models reveals that the multilingual model slightly outperforms the code-focused model in accuracy after fine-tuning, while the code-focused model offers faster inference. These results demonstrate that, even with very limited non-English training data, customizing LLMs can bridge the code-generation gap, enabling high-quality assistance for Turkish developers comparable to that available for English. The dataset has been released on GitHub to facilitate further research in multilingual code generation.
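The parameter-efficient technique referenced in the abstract (the reference list includes LoRA, low-rank adaptation) can be illustrated with a minimal sketch. This is not the authors' training code; the matrix sizes, rank, and scaling factor below are illustrative assumptions. The idea is that the pretrained weight W stays frozen while only two small matrices A and B of rank r are trained, so the effective weight becomes W + (alpha / r) * B @ A.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not the paper's: a tiny 8x8 layer adapted at rank 2.
d_in, d_out, r, alpha = 8, 8, 2, 4
W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))                   # trainable up-projection, zero-initialized

def lora_forward(x):
    """Forward pass: frozen weight plus the scaled low-rank update."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)

# With B initialized to zero, the adapted layer exactly matches the frozen one,
# so fine-tuning starts from the pretrained model's behavior.
assert np.allclose(lora_forward(x), W @ x)

# Only r * (d_in + d_out) parameters are trained instead of d_in * d_out.
print(r * (d_in + d_out), "trainable vs", d_in * d_out, "frozen")
```

The zero-initialization of B is what makes the method safe to apply to a pretrained model: the adaptation contributes nothing at step zero and grows only as training updates A and B.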

References

  1. J. He, C. Zhou, X. Ma, T. Berg-Kirkpatrick, & G. Neubig, "Towards a unified view of parameter-efficient transfer learning", 2021. doi: 10.48550/arXiv.2110.04366
  2. N. Houlsby, A. Giurgiu, S. Jastrzȩbski, B. Morrone, Q. Laroussilhe, A. Gesmundo et al., "Parameter-efficient transfer learning for NLP", 2019. doi: 10.48550/arXiv.1902.00751
  3. X. Liu, P. He, W. Chen, & J. Gao, "Multi-task deep neural networks for natural language understanding", 2019. doi: 10.18653/v1/p19-1441
  4. M. Anschütz, D. Lozano, & G. Groh, "This is not correct! Negation-aware evaluation of language generation systems", 2023. doi: 10.18653/v1/2023.inlg-main.12
  5. Lodha, G. Belapurkar, S. Chalkapurkar, Y. Tao, R. Ghosh, S. Basu et al., "On surgical fine-tuning for language encoders", 2023. doi: 10.18653/v1/2023.findings-emnlp.204
  6. E. Hu, Y. Shen, P. Wallis, Z. Allen-Zhu, Y. Li, S. Wang et al., "LoRA: Low-rank adaptation of large language models", 2021. doi: 10.48550/arXiv.2106.09685
  7. Y. Hu, Y. Xie, T. Wang, M. Chen, & Z. Pan, "Structure-aware low-rank adaptation for parameter-efficient fine-tuning", Mathematics, vol. 11, no. 20, p. 4317, 2023. doi: 10.3390/math11204317
  8. N. Dhinagar, S. Ozarkar, K. Buwa, S. Thomopoulos, C. Owens-Walton, E. Laltoo et al., "Parameter-efficient fine-tuning of transformer-based masked autoencoder enhances resource-constrained neuroimage analysis", 2025. doi: 10.1101/2025.02.15.638442

Details

Primary Language

English

Subjects

Computer Software, Software Engineering (Other)

Journal Section

Research Article

Early Pub Date

October 13, 2025

Publication Date

December 29, 2025

Submission Date

June 18, 2025

Acceptance Date

July 14, 2025

Published in Issue

Year 2025 Volume: 8 Number: 4

APA
Uluırmak, B., & Kurban, R. (2025). Fine-tuning Large Language Models for Turkish Flutter Code Generation. Sakarya University Journal of Computer and Information Sciences, 8(4), 637-650. https://doi.org/10.35377/saucis.1722643
AMA
1. Uluırmak B, Kurban R. Fine-tuning Large Language Models for Turkish Flutter Code Generation. SAUCIS. 2025;8(4):637-650. doi:10.35377/saucis.1722643
Chicago
Uluırmak, Bugra, and Rifat Kurban. 2025. “Fine-Tuning Large Language Models for Turkish Flutter Code Generation”. Sakarya University Journal of Computer and Information Sciences 8 (4): 637-50. https://doi.org/10.35377/saucis.1722643.
EndNote
Uluırmak B, Kurban R (December 1, 2025) Fine-tuning Large Language Models for Turkish Flutter Code Generation. Sakarya University Journal of Computer and Information Sciences 8 4 637–650.
IEEE
[1] B. Uluırmak and R. Kurban, “Fine-tuning Large Language Models for Turkish Flutter Code Generation”, SAUCIS, vol. 8, no. 4, pp. 637–650, Dec. 2025, doi: 10.35377/saucis.1722643.
ISNAD
Uluırmak, Bugra - Kurban, Rifat. “Fine-Tuning Large Language Models for Turkish Flutter Code Generation”. Sakarya University Journal of Computer and Information Sciences 8/4 (December 1, 2025): 637-650. https://doi.org/10.35377/saucis.1722643.
JAMA
1. Uluırmak B, Kurban R. Fine-tuning Large Language Models for Turkish Flutter Code Generation. SAUCIS. 2025;8(4):637–650.
MLA
Uluırmak, Bugra, and Rifat Kurban. “Fine-Tuning Large Language Models for Turkish Flutter Code Generation”. Sakarya University Journal of Computer and Information Sciences, vol. 8, no. 4, Dec. 2025, pp. 637-50, doi:10.35377/saucis.1722643.
Vancouver
1. Bugra Uluırmak, Rifat Kurban. Fine-tuning Large Language Models for Turkish Flutter Code Generation. SAUCIS. 2025 Dec. 1;8(4):637-50. doi:10.35377/saucis.1722643

The papers in this journal are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.