Please use this identifier to cite or link to this item: http://repository.ipb.ac.id/handle/123456789/171170
Title: LEVERAGING SEQUENTIAL AND ATTENTION-BASED DEEP LEARNING ARCHITECTURES FOR SKILLFUL DAILY RAINFALL PREDICTION IN JAKARTA, INDONESIA USING ATMOSPHERIC PREDICTORS
Authors: Setiawan, Sonni
Lubis, Sandro Wellyanto
Hardiano, Akhdan Fadhilah Yaskur
Issue Date: 2025
Publisher: IPB University
Abstract: In this study, we develop and evaluate daily rainfall prediction models using deep learning architectures, specifically comparing Long Short-Term Memory (LSTM) and Transformer models driven by various atmospheric predictors. The results show that the LSTM achieves higher accuracy at short-term lags, with R² reaching 0.94 and RMSE as low as 4.81 at lag-3, while the Transformer demonstrates more consistent performance across all input lags, maintaining stable R² values of around 0.87–0.88. Applying a 5-day smoothing pre-processing step significantly improves prediction accuracy for both models, particularly for the LSTM, which is more sensitive to fluctuations in the raw data. Adding tropical wave variables does not substantially improve model performance and can reduce LSTM accuracy at longer lags due to the increased input complexity; in contrast, the Transformer remains relatively robust to these variations. Among all predictors, vertically integrated moisture flux divergence (VIMD) stands out as the most influential, underscoring its physical relevance to convective precipitation processes in monsoonal regions. These findings highlight that while the LSTM excels at capturing short-term temporal dynamics, the Transformer offers a stable framework for longer-range rainfall forecasting.
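The 5-day smoothing and lagged-input setup described in the abstract can be sketched as follows. This is an illustrative outline only, using synthetic data and assumed parameter choices (centered window, lag-3 inputs); it is not the authors' actual pipeline, and the real study uses atmospheric predictors such as VIMD rather than rainfall alone:

```python
import numpy as np
import pandas as pd

# Synthetic daily rainfall series standing in for the Jakarta data
# (gamma-distributed values are a common stand-in for rainfall amounts).
rng = np.random.default_rng(0)
dates = pd.date_range("2020-01-01", periods=365, freq="D")
rain = pd.Series(rng.gamma(shape=2.0, scale=5.0, size=365), index=dates)

# 5-day moving-average smoothing, as described in the abstract.
# center=True is an assumption; the thesis may use a trailing window.
rain_smooth = rain.rolling(window=5, center=True, min_periods=1).mean()

# Build lagged inputs (e.g. lag-3) as a sequence model would consume them.
lag = 3
X = pd.concat(
    {f"t-{k}": rain_smooth.shift(k) for k in range(1, lag + 1)}, axis=1
).dropna()
y = rain_smooth.loc[X.index]

# Smoothing damps day-to-day fluctuation, which the abstract reports
# helps the LSTM in particular.
print(f"raw std: {rain.std():.2f}, smoothed std: {rain_smooth.std():.2f}")
```

The resulting `X` (lagged predictors) and `y` (target) pairs are what an LSTM or Transformer regressor would be trained on, one window per day.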
URI: http://repository.ipb.ac.id/handle/123456789/171170
Appears in Collections:UT - Geophysics and Meteorology

Files in This Item:
File                                                       Description  Size     Format
cover_G2401211080_ae27f2b9043a4fb69446df93cc03c1f8.pdf     Cover        2.58 MB  Adobe PDF
fulltext_G2401211080_d329b949e3ee42c9a30c43946bd01c87.pdf  Fulltext     8.5 MB   Adobe PDF
  (Restricted Access)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.