Subsurface drip irrigation systems, compared with surface irrigation systems such as basin and furrow irrigation, improve the delivery of water and nutrients directly into the root zone. The purposes of this study were to determine wetting front advance under subsurface drip irrigation and to compare the measurements with HYDRUS 2D model simulations. Irrigation was applied with T-Tape on a sandy loam soil through two emitters for different irrigation durations, and soil water content was measured with a WET moisture meter. The simulated and measured soil water contents were compared using the adjusted coefficient of determination (R2), the relative error (RE), and the normalized root mean square error (NRMSE). Based on the results, the NRMSE of the soil water content predictions for the emitters at depths of 20 and 40 cm ranged from 10 to 19 percent and from 10 to 13 percent, respectively. The RE for the emitters at depths of 20 and 40 cm ranged from -16 to -5 percent and from 8 to 11 percent, respectively, and the average R2 was 0.87 and 0.98, respectively. In addition, five scenarios (F1, F2, T1, T2 and S1) were evaluated to assess the amount of water stored in the soil profile and the water mass balance. The results indicate that the model can be used to predict soil water content under subsurface drip irrigation.
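For readers who wish to reproduce the model evaluation, a minimal Python sketch of the three goodness-of-fit statistics is given below. The exact formulations (in particular, how NRMSE and RE are normalized) are assumptions, since the abstract does not define them, and the water content values are illustrative only.

import numpy as np

def evaluate(measured, simulated):
    # Hypothetical sketch of the statistics named in the abstract (R2, RE, NRMSE);
    # the precise definitions used in the study are assumed, not taken from the paper.
    measured = np.asarray(measured, dtype=float)
    simulated = np.asarray(simulated, dtype=float)

    # Coefficient of determination of the linear relation between
    # measured and simulated values.
    r2 = np.corrcoef(measured, simulated)[0, 1] ** 2

    # Relative error (%): mean bias of the simulation relative to the
    # mean measured value (negative values indicate underestimation).
    re = 100.0 * np.mean(simulated - measured) / np.mean(measured)

    # Normalized root mean square error (%), normalized here by the
    # mean of the measured values.
    nrmse = 100.0 * np.sqrt(np.mean((simulated - measured) ** 2)) / np.mean(measured)

    return r2, re, nrmse

# Example with made-up volumetric water contents (cm3 cm-3)
theta_measured = [0.18, 0.22, 0.27, 0.30, 0.26]
theta_simulated = [0.17, 0.23, 0.28, 0.28, 0.27]
print(evaluate(theta_measured, theta_simulated))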
Type of Study: Research | Subject: General
Received: 2016/06/11 | Accepted: 2018/03/14 | Published: 2019/03/15