A new type of search engine is being developed that can interrogate networks of sensors to give real-time answers to questions about the physical world.
The European-funded project, known as SMART, for 'Search engine for MultimediA Environment geneRated content', aims to develop and implement an open source system to allow internet users to search and analyse data from any network of sensors.
By matching search queries with information from sensors and cross-referencing data from social networks such as Twitter, users will be able to receive detailed responses to questions such as 'What part of the city hosts live music events which my friends have been to recently?' or 'How busy is the city centre?' Currently, standard search engines such as Google are not able to answer search queries of this type.
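To make the idea concrete, a minimal sketch of the cross-referencing step described above is shown below. All names and data here are invented for illustration; the actual SMART system works over live sensor feeds and social network APIs, not hard-coded lists.

```python
# Hypothetical sketch: answering "what part of the city hosts live music
# events my friends have been to recently?" by intersecting sensor-derived
# events with friends' recent locations. Data is invented for illustration.
live_music_events = [  # events a sensor network might detect, per district
    {"district": "west-end", "event": "open-air gig"},
    {"district": "merchant-city", "event": "jazz night"},
]
friends_checkins = {  # districts friends recently visited (e.g. via Twitter)
    "alice": {"merchant-city"},
    "bob": {"merchant-city", "harbour"},
}

def districts_with_music_visited_by_friends(events, checkins):
    """Return districts that both host live music and appear in friends' check-ins."""
    music_districts = {e["district"] for e in events}
    visited = set().union(*checkins.values())
    return sorted(music_districts & visited)

result = districts_with_music_visited_by_friends(live_music_events, friends_checkins)
print(result)  # → ['merchant-city']
```

The key point is the join between two independent data sources: one populated by physical-world sensing, the other by social activity.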
The SMART project is a joint research initiative of nine partners including the University of Glasgow, Atos, Athens Information Technology, IBM's Haifa Research Lab, Imperial College London, City of Santander, PRISA Digital, Telesto and Consorzio S3 Log. "The SMART project will be built upon an open-source search engine technology known as Terrier, which we have been developing at the University since 2004, and we're pleased to be involved in this innovative research initiative," said Dr Iadh Ounis, of the University of Glasgow's School of Computing Science.
"The SMART engine will be able to answer high-level queries by automatically identifying cameras, microphones and other sensors that can contribute to the query, then synthesising results stemming from distributed sources in an intelligent way. SMART builds upon the existing concept of 'smart cities', physical spaces which are covered in an array of intelligent sensors which communicate with each other and can be searched for information. The search results sourced from these smart cities can be reused across multiple applications, making the system more effective. We expect that SMART will be tested in a real city by 2014."
SMART will develop a scalable search and retrieval architecture for multimedia data, along with intelligent techniques for real-time processing, search and retrieval of physical world multimedia. The SMART framework will boost scalability in both functional and business terms, while being extensible in terms of sensors and multimedia data processing algorithms.
The matching will be based on the sensors' context and metadata (e.g., location, state, capabilities), as well as on the dynamic context of the physical world as the latter is perceived by processing algorithms (such as face detectors, person trackers, classifiers of acoustic events and components for crowd analysis). At the same time, SMART will be able to leverage Web 2.0 social network information in order to facilitate social queries on physical world multimedia.
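The metadata-based matching described above can be sketched as a simple filter over sensor descriptors. This is an illustrative example only, not the SMART implementation; the sensor records, fields and query shape are assumptions made for clarity.

```python
# Illustrative sketch (not the SMART codebase): selecting candidate sensors
# for a high-level query by matching their metadata (location, state,
# capabilities), as the article describes.
from dataclasses import dataclass, field

@dataclass
class Sensor:
    sensor_id: str
    location: str                                    # e.g. a named city area
    capabilities: set = field(default_factory=set)   # e.g. {"audio", "video"}
    state: str = "online"

def match_sensors(sensors, location, required_capabilities):
    """Return online sensors in the requested location that provide
    all the capabilities the query's processing algorithms need."""
    required = set(required_capabilities)
    return [
        s for s in sensors
        if s.state == "online"
        and s.location == location
        and required <= s.capabilities
    ]

# A "how busy is the city centre?" query might need video feeds for crowd analysis.
sensors = [
    Sensor("cam-01", "city-centre", {"video"}),
    Sensor("mic-07", "city-centre", {"audio"}),
    Sensor("cam-12", "harbour", {"video"}),
    Sensor("cam-03", "city-centre", {"video"}, state="offline"),
]
candidates = match_sensors(sensors, "city-centre", {"video"})
print([s.sensor_id for s in candidates])  # → ['cam-01']
```

In a full system the results from the selected sensors would then be fed to the relevant processing algorithms (crowd analysis, acoustic classification) and synthesised into an answer.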
The SMART project is part of the University of Glasgow's growing theme of research on sensor systems. The University aims to ensure that its research portfolio can provide entire sensor solutions, from novel physical sensors to intelligent applications and visualisations of sensor inputs. The University is also part of the Scottish Sensor Systems Centre, funded by the Scottish Funding Council: a collaboration between eight of Scotland's leading universities and industry to undertake joint industrial/academic projects in sensor systems.
For more information on SMART: http://www.smartfp7.eu.
For more information on the Scottish Sensor Systems Centre: sensorsystems.org.uk.
This article originally appeared on EE Times Europe.