Please use this identifier to cite or link to this item:
http://repository.ipb.ac.id/handle/123456789/171170

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | Setiawan, Sonni | - |
| dc.contributor.advisor | Lubis, Sandro Wellyanto | - |
| dc.contributor.author | Hardiano, Akhdan Fadhilah Yaskur | - |
| dc.date.accessioned | 2025-09-25T08:29:34Z | - |
| dc.date.available | 2025-09-25T08:29:34Z | - |
| dc.date.issued | 2025 | - |
| dc.identifier.uri | http://repository.ipb.ac.id/handle/123456789/171170 | - |
| dc.description.abstract | In this study, we developed and evaluated daily rainfall prediction models using deep learning architectures, specifically comparing Long Short-Term Memory (LSTM) and Transformer models with various atmospheric predictors. The results show that LSTM achieves higher accuracy at short-term lags, with R² reaching 0.94 and an RMSE of 4.81 at lag-3, while the Transformer shows more consistent performance across all input lags, maintaining stable R² values of about 0.87–0.88. Applying 5-day data smoothing as a pre-processing step contributed significantly to improved prediction accuracy in both models, especially LSTM, which is more sensitive to fluctuations in the raw data. Adding tropical wave variables did not significantly improve model performance and can reduce LSTM accuracy at longer lags because of the increased input complexity. In contrast, the Transformer remained relatively robust to these variations. Among all predictors, the vertically integrated moisture flux divergence (VIMD) was the most influential, underscoring its physical relevance to convective precipitation processes in the monsoon region. These findings highlight that although LSTM excels at capturing short-term temporal dynamics, the Transformer offers a stable framework for longer-range rainfall forecasting. | - |
| dc.description.abstract | In this study, we develop and evaluate daily rainfall prediction models using deep learning architectures, specifically comparing Long Short-Term Memory (LSTM) and Transformer models with various atmospheric predictors. The results show that LSTM yields higher accuracy at short-term lags, with R² reaching 0.94 and RMSE as low as 4.81 at lag-3, while the Transformer demonstrates more consistent performance across all lags, maintaining stable R² values around 0.87–0.88. Applying a 5-day smoothing pre-processing step significantly enhances prediction quality for both models, particularly for LSTM, which is more sensitive to fluctuations in raw data. Adding tropical wave variables does not substantially improve model performance and can reduce LSTM accuracy at longer lags due to increased input complexity. In contrast, the Transformer remains relatively robust to these variations. Among all predictors, the vertically integrated moisture flux divergence (VIMD) stands out as the most important predictor, emphasizing its physical relevance to precipitation processes in convective and monsoonal regions. These findings highlight that while LSTM excels at capturing short-term temporal dynamics, the Transformer offers a stable framework for longer-range rainfall forecasting. | - |
| dc.description.sponsorship | null | - |
| dc.language.iso | id | - |
| dc.publisher | IPB University | id |
| dc.title | LEVERAGING SEQUENTIAL AND ATTENTION-BASED DEEP LEARNING ARCHITECTURES FOR SKILLFUL DAILY RAINFALL PREDICTION IN JAKARTA, INDONESIA USING ATMOSPHERIC PREDICTORS | id |
| dc.title.alternative | null | - |
| dc.type | Skripsi | - |
| dc.subject.keyword | deep learning | id |
| dc.subject.keyword | LSTM | id |
| dc.subject.keyword | Rainfall prediction | id |
| dc.subject.keyword | Transformer | id |
| dc.subject.keyword | VIMD | id |
| Appears in Collections: | UT - Geophysics and Meteorology | |
Files in This Item:
| File | Description | Size | Format | |
|---|---|---|---|---|
| cover_G2401211080_ae27f2b9043a4fb69446df93cc03c1f8.pdf | Cover | 2.58 MB | Adobe PDF | View/Open |
| fulltext_G2401211080_d329b949e3ee42c9a30c43946bd01c87.pdf (Restricted Access) | Fulltext | 8.5 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
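The pre-processing described in the abstract (5-day smoothing of the daily series, followed by building lagged predictor windows such as lag-3) can be sketched as below. This is a minimal illustration, not code from the thesis: the function name, the choice of a simple moving average for smoothing, and the synthetic rainfall series are all assumptions.

```python
import numpy as np

def smooth_and_lag(series, window=5, lag=3):
    """Apply a `window`-day moving average, then build (X, y) pairs
    where each X row holds the previous `lag` smoothed values
    and y is the next smoothed value (assumed setup, not the thesis's)."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(series, kernel, mode="valid")  # length N - window + 1
    X = np.array([smoothed[i:i + lag] for i in range(len(smoothed) - lag)])
    y = smoothed[lag:]
    return X, y

# Synthetic one-year daily rainfall (gamma-distributed, mm/day) for illustration.
rain = np.random.default_rng(0).gamma(2.0, 3.0, size=365)
X, y = smooth_and_lag(rain)
print(X.shape, y.shape)  # → (358, 3) (358,)
```

Windows shaped like `X` here would then be fed to an LSTM or Transformer as input sequences, with `y` as the prediction target.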