Paddy crop and weed classification using color features for computer vision based precision agriculture

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)


Weed detection in paddy fields using robotic vision remains a challenging task, mainly because of the lack of suitable datasets. This work describes the creation of a paddy crop and weed image dataset. The images were acquired with a digital camera under natural lighting conditions, and an annotated image was created for each by manual annotation after segmenting the plants from the soil background. The dataset consists of 300 images. In addition, paddy crop and two types of weeds, namely sedges and grass-type weeds, have been classified using only color features. Crop and weed discrimination is usually performed with texture, shape, and color features together, but using all three can make the system computationally intensive. Here, color features are extracted with a novel approach based on Speeded-Up Robust Features (SURF). Random Forest, K-Nearest Neighbors (K-NN), and Least Squares Support Vector Machine (LSSVM) classifiers were used to classify the paddy crop and the two weed types, and all three achieved an accuracy of around 86%. This indicates that color features alone can be relied upon to discriminate paddy crop from sedges and grass-type weeds.
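The abstract's pipeline (color features per plant region, then a conventional classifier) can be sketched as follows. This is a minimal illustration, not the paper's method: the SURF-based extraction step is replaced by synthetic three-dimensional color features, and the class means, sample counts, and classifier settings are all assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical color features (e.g., mean R, G, B sampled around
# SURF keypoints). Three classes: paddy, sedge, grass-type weed.
# The class means below are invented for illustration only.
n_per_class = 100
means = np.array([
    [0.20, 0.60, 0.20],  # paddy (assumed)
    [0.30, 0.50, 0.10],  # sedge (assumed)
    [0.25, 0.55, 0.30],  # grass-type weed (assumed)
])
X = np.vstack([rng.normal(m, 0.05, size=(n_per_class, 3)) for m in means])
y = np.repeat([0, 1, 2], n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

# Two of the three classifiers mentioned in the abstract
# (LSSVM is omitted; it is not part of scikit-learn).
for clf in (RandomForestClassifier(random_state=0),
            KNeighborsClassifier(n_neighbors=5)):
    clf.fit(X_tr, y_tr)
    print(type(clf).__name__, round(clf.score(X_te, y_te), 2))
```

On real data the feature vectors would come from color values sampled at SURF keypoints on the segmented plant regions, and accuracy would be reported per classifier as in the paper.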

Original language: English
Pages (from-to): 2909-2916
Number of pages: 8
Journal: International Journal of Engineering and Technology (UAE)
Issue number: 4
Publication status: Published - 01-01-2018

All Science Journal Classification (ASJC) codes

  • Biotechnology
  • Computer Science (miscellaneous)
  • Environmental Engineering
  • Chemical Engineering (all)
  • Engineering (all)
  • Hardware and Architecture
