Video retrieval based on patterns of oriented edge magnitude

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    In this work, a video retrieval system based on the POEM (Patterns of Oriented Edge Magnitudes) descriptor is proposed. In the first stage, the input video is partitioned into shots based on Gabor moments, and keyframes are selected from each shot using the Temporally Maximum Occurrence Frame (TMOF) method. In the next stage, the POEM descriptor is computed from each keyframe to obtain a robust image/frame representation. Given a query frame, its descriptor is obtained in the same manner and compared with the descriptors of the video keyframes using a nearest-neighbour matching technique to find the matching keyframe. We have conducted experiments on TRECVID video segments to demonstrate the superiority of the proposed approach for video retrieval applications.
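    The final retrieval stage described in the abstract can be sketched as a nearest-neighbour search over keyframe descriptors. The snippet below is a minimal illustration only: the descriptors are plain lists of floats standing in for POEM histograms, and the function names and Euclidean distance are assumptions, not the authors' implementation.

    ```python
    # Hypothetical sketch of nearest-neighbour matching between a query
    # descriptor and keyframe descriptors (stand-ins for POEM histograms).

    def l2_distance(a, b):
        """Euclidean distance between two equal-length descriptors."""
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def nearest_keyframe(query, keyframe_descriptors):
        """Return the index of the keyframe whose descriptor is closest
        to the query descriptor."""
        return min(range(len(keyframe_descriptors)),
                   key=lambda i: l2_distance(query, keyframe_descriptors[i]))

    # Toy example: three keyframe descriptors and one query frame descriptor.
    keyframes = [[0.1, 0.9, 0.0], [0.8, 0.1, 0.1], [0.3, 0.3, 0.4]]
    query = [0.75, 0.15, 0.1]
    best = nearest_keyframe(query, keyframes)  # index of the matching keyframe
    ```

    In practice a histogram distance such as chi-square is also common for descriptor matching; the choice here is purely illustrative.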

    Original language: English
    Title of host publication: Proceedings of the 3rd International Symposium on Computer Vision and the Internet, VisionNet 2016
    Publisher: Association for Computing Machinery
    Pages: 115-120
    Number of pages: 6
    ISBN (Electronic): 9781450343015
    DOIs
    Publication status: Published - 21-09-2016
    Event: 3rd International Symposium on Computer Vision and the Internet, VisionNet 2016 - Jaipur, India
    Duration: 21-09-2016 to 24-09-2016

    Publication series

    Name: ACM International Conference Proceeding Series
    Volume: 21-24-September-2016

    Conference

    Conference: 3rd International Symposium on Computer Vision and the Internet, VisionNet 2016
    Country/Territory: India
    City: Jaipur
    Period: 21-09-16 to 24-09-16

    All Science Journal Classification (ASJC) codes

    • Software
    • Human-Computer Interaction
    • Computer Vision and Pattern Recognition
    • Computer Networks and Communications
