Annotating photo collections by label propagation according to multiple similarity cues
Proceedings of the 16th ACM International Conference on Multimedia (MM '08)
This paper considers the emerging problem of annotating personal photo collections that are taken with digital cameras and may have been subsequently organized by their owners. Unlike images from web search engines or commercial image banks (e.g., the Corel database), the photos in the same personal collection are related to one another in time, location, and content. Advanced devices can record the GPS coordinates of each photo and thus provide a richer source of context with which to model and enforce the correlation between photos in the same collection. Recognizing the well-known limitations ("semantic gap") of visual recognition algorithms, we exploit this correlation between photos to improve annotation performance. In our approach, high-confidence annotation labels are first obtained for certain photos and then propagated to the remaining photos in the same collection according to time, location, and visual proximity (similarity). A novel generative probabilistic model is employed, which outperforms the previous linear propagation scheme. Experimental results demonstrate the advantages of the proposed annotation scheme.
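The propagation idea described above can be sketched as a similarity-weighted label-spreading loop. This is only an illustrative sketch, not the paper's generative probabilistic model: the photo fields (`time`, `lat`, `lon`, `feat`), the Gaussian kernel bandwidths, and the seed-clamping scheme are all assumptions chosen to show how time, location, and visual cues can be combined into one propagation weight.

```python
import math

def similarity(a, b, sigma_t=3600.0, sigma_g=0.01, sigma_v=1.0):
    """Combined proximity of two photos from time, GPS, and visual cues.
    Each cue is a Gaussian (RBF) kernel; the product means photos must be
    close under all cues to get a large propagation weight.
    The bandwidths sigma_t/sigma_g/sigma_v are illustrative, not from the paper."""
    dt = (a["time"] - b["time"]) ** 2                       # squared time gap (seconds)
    dg = (a["lat"] - b["lat"]) ** 2 + (a["lon"] - b["lon"]) ** 2  # squared GPS distance
    dv = sum((x - y) ** 2 for x, y in zip(a["feat"], b["feat"]))  # squared visual distance
    return (math.exp(-dt / (2 * sigma_t ** 2))
            * math.exp(-dg / (2 * sigma_g ** 2))
            * math.exp(-dv / (2 * sigma_v ** 2)))

def propagate(photos, seed_labels, n_iters=20):
    """Spread labels from high-confidence seed photos to the rest of the
    collection by repeatedly averaging neighbors' label scores, weighted
    by the combined similarity. Seed photos stay clamped to their labels."""
    n = len(photos)
    labels = sorted({l for ls in seed_labels.values() for l in ls})
    scores = {i: {l: 0.0 for l in labels} for i in range(n)}
    for i, ls in seed_labels.items():
        for l in ls:
            scores[i][l] = 1.0
    # Pairwise propagation weights (no self-loops).
    W = [[similarity(photos[i], photos[j]) if i != j else 0.0
          for j in range(n)] for i in range(n)]
    for _ in range(n_iters):
        new = {}
        for i in range(n):
            if i in seed_labels:
                new[i] = scores[i]          # clamp high-confidence seeds
                continue
            z = sum(W[i]) or 1.0            # normalizer (guard against isolation)
            new[i] = {l: sum(W[i][j] * scores[j][l] for j in range(n)) / z
                      for l in labels}
        scores = new
    return scores
```

With this toy setup, an unlabeled photo taken a minute after, and a few meters from, a seed photo labeled "beach" inherits a high "beach" score, while a photo far away in time and space does not. The paper's contribution is replacing such a linear averaging scheme with a generative probabilistic model, which this sketch does not capture.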