RT Journal Article
T1 Deep convolutional neural network for damaged vegetation segmentation from RGB images based on virtual NIR-channel estimation
A1 Picon, Artzai
A1 Bereciartua-Perez, Arantza
A1 Eguskiza, Itziar
A1 Romero-Rodriguez, Javier
A1 Jimenez-Ruiz, Carlos Javier
A1 Eggers, Till
A1 Klukas, Christian
A1 Navarra-Mestre, Ramon
AB Performing accurate and automated semantic segmentation of vegetation is a first algorithmic step towards more complex models that can extract accurate biological information on crop health, weed presence and phenological state, among others. Traditionally, models based on the normalized difference vegetation index (NDVI), the near-infrared (NIR) channel or RGB have been good indicators of vegetation presence. However, these methods are not suitable for accurately segmenting vegetation showing damage, which precludes their use in downstream phenotyping algorithms. In this paper, we propose a comprehensive method for robust vegetation segmentation in RGB images that can cope with damaged vegetation. The method first uses a regression convolutional neural network to estimate a virtual NIR channel from an RGB image. Second, we compute two newly proposed vegetation indices from this estimated virtual NIR channel: the infrared-dark channel subtraction (IDCS) and infrared-dark channel ratio (IDCR) indices. Finally, both the RGB image and the estimated indices are fed into a semantic segmentation deep convolutional neural network to train a model to segment vegetation regardless of damage or condition. The model was tested on 84 plots containing thirteen vegetation species showing different degrees of damage, acquired over 28 days. The results show that the best segmentation is obtained when the input image is augmented with the proposed virtual NIR channel (F1=0.94) and with the proposed IDCR and IDCS vegetation indices (F1=0.95) derived from the estimated NIR channel, whereas using only the image or RGB-based indices leads to inferior performance (RGB: F1=0.90; NIR: F1=0.82; NDVI: F1=0.89). The proposed approach provides end-to-end land-cover map segmentation directly from simple RGB images and has been successfully validated in real field conditions.
SN 2589-7217
YR 2022
FD 2022-01
LA eng
NO Picon, A, Bereciartua-Perez, A, Eguskiza, I, Romero-Rodriguez, J, Jimenez-Ruiz, C J, Eggers, T, Klukas, C & Navarra-Mestre, R 2022, 'Deep convolutional neural network for damaged vegetation segmentation from RGB images based on virtual NIR-channel estimation', Artificial Intelligence in Agriculture, vol. 6, pp. 199-210. https://doi.org/10.1016/j.aiia.2022.09.004
NO Publisher Copyright: © 2022 The Authors
DS TECNALIA Publications
RD 3 Jul 2024