Improving Web Image Search Results Using Query-Relative Classifiers

Place: Large Lecture Room - CVC

Affiliation: INRIA Grenoble, France  

With the growth of multimedia content on the web, web image search using text queries has received considerable attention. However, current approaches that exploit visual information require a relevance model to be trained for every new query, and are therefore unsuitable for real-world web search applications. The idea I will present in this talk is to use classifiers based on query-relative features, which can re-rank images retrieved by new text queries without any additional training. The query-relative features combine textual features, based on the occurrence of query terms in web pages and image meta-data, with visual features derived from bag-of-words representations of the images. For evaluation purposes we use a new database of 71,478 images returned by a web search engine for 353 different search queries, along with their meta-data and ground-truth annotations. Using this data set, we compared the image ranking performance of our model with that of the search engine, and with an approach that learns a separate classifier for each query. Our model, which uses query-relative features, improves significantly over the raw search engine ranking and also outperforms the query-specific models.
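
To make the re-ranking idea concrete, below is a minimal sketch in Python of how query-relative features and a single cross-query ranker could fit together. Everything specific in it is an illustrative assumption rather than the talk's exact pipeline: the choice of text fields, the fraction-of-query-terms encoding, the absolute-difference visual features, and the logistic-regression ranker all stand in for the actual feature definitions and classifier.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Text fields assumed here for illustration; the actual set may differ.
TEXT_FIELDS = ["title", "filename", "alt_text", "page_text"]

def text_features(query, meta):
    """Query-relative text features: for each text field, the fraction
    of query terms that occur in it."""
    terms = query.lower().split()
    return np.array([sum(t in meta.get(f, "").lower() for t in terms) / len(terms)
                     for f in TEXT_FIELDS])

def visual_features(hist, mean_hist):
    """Query-relative visual features: per-bin absolute difference between
    an image's bag-of-words histogram and the mean histogram over the
    query's result set (an illustrative choice of comparison)."""
    return np.abs(hist - mean_hist)

def query_relative_features(query, items):
    """items: list of (metadata dict, bag-of-words histogram) pairs for
    the images returned by one query."""
    mean_hist = np.mean([h for _, h in items], axis=0)
    return np.vstack([np.concatenate([text_features(query, m),
                                      visual_features(h, mean_hist)])
                      for m, h in items])

def rerank(clf, query, items):
    """Score a new query's results with the fixed classifier; because the
    features are query-relative, no per-query retraining is needed."""
    scores = clf.decision_function(query_relative_features(query, items))
    return np.argsort(-scores)  # result indices, most relevant first

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def fake_items(query, n=20):
        """Synthetic stand-in for one query's search results."""
        items, labels = [], []
        for _ in range(n):
            relevant = rng.random() < 0.5
            meta = {"title": query if relevant else "something else"}
            hist = rng.dirichlet(np.ones(50) * (5 if relevant else 1))
            items.append((meta, hist))
            labels.append(int(relevant))
        return items, labels

    # Train ONCE on labelled results from a set of training queries.
    X, y = [], []
    for q in ["eiffel tower", "golden retriever", "red rose"]:
        items, labels = fake_items(q)
        X.append(query_relative_features(q, items))
        y.extend(labels)
    clf = LogisticRegression(max_iter=1000).fit(np.vstack(X), y)

    # Apply the same classifier to an unseen query with no extra training.
    new_items, new_labels = fake_items("sports car")
    order = rerank(clf, "sports car", new_items)
    print("labels in re-ranked order:", [new_labels[i] for i in order])
```

The key point the sketch tries to capture is that the features describe how a result relates to its query (term occurrence, deviation from the query's mean appearance) rather than what the query depicts, so one classifier transfers across queries it has never seen.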