
dc.contributor.advisor: Setiawan, Sonni
dc.contributor.advisor: Lubis, Sandro Wellyanto
dc.contributor.author: Hardiano, Akhdan Fadhilah Yaskur
dc.date.accessioned: 2025-09-25T08:29:34Z
dc.date.available: 2025-09-25T08:29:34Z
dc.date.issued: 2025
dc.identifier.uri: http://repository.ipb.ac.id/handle/123456789/171170
dc.description.abstract: In this study, we developed and evaluated daily rainfall prediction models using deep learning architectures, specifically comparing Long Short-Term Memory (LSTM) and Transformer models with various atmospheric predictors. The results show that LSTM achieves higher accuracy at short-term lags, with R² reaching 0.94 and an RMSE of 4.81 at lag-3, while the Transformer shows more consistent performance across all input lags, maintaining stable R² values around 0.87–0.88. Applying a 5-day data-smoothing pre-processing step contributes significantly to improved prediction accuracy for both models, especially LSTM, which is more sensitive to fluctuations in the raw data. Adding tropical wave variables does not significantly improve model performance and can reduce LSTM accuracy at longer lags due to the increased input complexity. In contrast, the Transformer remains relatively robust to these variations. Among all predictors, vertically integrated moisture flux divergence (VIMD) is the most influential, underscoring its physical relevance to convective precipitation processes in the monsoon region. These findings highlight that while LSTM excels at capturing short-term temporal dynamics, the Transformer offers a stable framework for longer-range rainfall forecasting.
dc.description.abstract: In this study, we develop and evaluate daily rainfall prediction models using deep learning architectures, specifically comparing Long Short-Term Memory (LSTM) and Transformer models with various atmospheric predictors. The results show that LSTM yields higher accuracy at short-term lags, with R² reaching 0.94 and RMSE as low as 4.81 at lag-3, while the Transformer demonstrates more consistent performance across all lags, maintaining stable R² values around 0.87–0.88. Applying a 5-day smoothing pre-processing step significantly enhances prediction quality for both models, particularly for LSTM, which is more sensitive to fluctuations in raw data. Adding tropical wave variables does not substantially improve model performance and can reduce LSTM accuracy at longer lags due to increased input complexity. In contrast, the Transformer remains relatively robust to these variations. Among all predictors, the vertically integrated moisture flux divergence (VIMD) stands out as the most important predictor, emphasizing its physical relevance to precipitation processes in convective and monsoonal regions. These findings highlight that while LSTM excels at capturing short-term temporal dynamics, the Transformer offers a stable framework for longer-range rainfall forecasting.
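The pre-processing and evaluation pipeline described in the abstract (a 5-day smoothing step, lagged input windows, and R²/RMSE scoring) can be sketched as follows. This is an illustrative sketch only; the function names, the moving-average choice of smoother, and the windowing details are assumptions, not the thesis code:

```python
import numpy as np

def smooth_and_lag(series, window=5, lag=3):
    """Apply a `window`-day moving average, then build supervised pairs:
    each X row holds the previous `lag` smoothed values, y is the next one."""
    smoothed = np.convolve(series, np.ones(window) / window, mode="valid")
    X = np.array([smoothed[i:i + lag] for i in range(len(smoothed) - lag)])
    y = smoothed[lag:]
    return X, y

def r2_rmse(y_true, y_pred):
    """Coefficient of determination and root-mean-square error."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot, float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```

The resulting `X` array (shape `[samples, lag]`) is the kind of sequence input an LSTM or Transformer layer would consume after adding a feature dimension; the abstract's lag-3 results correspond to `lag=3` here.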
dc.description.sponsorship:
dc.language.iso: id
dc.publisher: IPB University [id]
dc.title: LEVERAGING SEQUENTIAL AND ATTENTION-BASED DEEP LEARNING ARCHITECTURES FOR SKILLFUL DAILY RAINFALL PREDICTION IN JAKARTA, INDONESIA USING ATMOSPHERIC PREDICTORS [id]
dc.title.alternative:
dc.type: Skripsi
dc.subject.keyword: deep learning [id]
dc.subject.keyword: LSTM [id]
dc.subject.keyword: Rainfall prediction [id]
dc.subject.keyword: Transformer [id]
dc.subject.keyword: VIMD [id]

