The ALIEN Visual Tracker application IS OUT!
Download it here: http://www.micc.unifi.it/pernici/
(available for Windows 7, 64-bit).
The ALIEN visual tracker is a generic visual object tracker achieving state-of-the-art performance. The object is selected at run-time by drawing a bounding box around it; its appearance is then learned and tracked as time progresses. The ALIEN tracker has been shown to outperform other competitive trackers, especially in the case of long-term tracking, large amounts of camera blur, low-frame-rate videos, and severe occlusions including full object disappearance.
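The select-then-track workflow above can be illustrated with a toy sketch. This is not ALIEN's algorithm (which learns a far more sophisticated appearance model); it only shows the same interface: initialize with a bounding box on the first frame, then localize the object in each later frame, here by a brute-force sum-of-squared-differences template search.

```python
# Toy "select, then track" sketch -- NOT the ALIEN algorithm, just the workflow:
# crop a bounding box from the first frame, then find the most similar patch
# in each subsequent frame by sum of squared differences (SSD).
import numpy as np

def track(frames, box):
    """frames: list of 2-D grayscale arrays; box: (top, left, height, width)."""
    t, l, h, w = box
    template = frames[0][t:t+h, l:l+w].astype(float)  # learned "appearance"
    positions = [(t, l)]
    for frame in frames[1:]:
        best, best_pos = float("inf"), positions[-1]
        # Exhaustively scan every candidate window of the template's size.
        for y in range(frame.shape[0] - h + 1):
            for x in range(frame.shape[1] - w + 1):
                patch = frame[y:y+h, x:x+w].astype(float)
                ssd = np.sum((patch - template) ** 2)
                if ssd < best:
                    best, best_pos = ssd, (y, x)
        positions.append(best_pos)
    return positions

# Synthetic demo: a bright 3x3 square moving right by 2 pixels per frame.
frames = []
for step in range(3):
    f = np.zeros((12, 12))
    f[4:7, 2 + 2*step:5 + 2*step] = 1.0
    frames.append(f)

print(track(frames, (4, 2, 3, 3)))  # -> [(4, 2), (4, 4), (4, 6)]
```

A real long-term tracker must additionally update the appearance model over time and handle occlusion and object disappearance, which is exactly where ALIEN's contribution lies.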
The scientific paper introducing the technology behind the tracker will appear at the 12th European Conference on Computer Vision (ECCV 2012) under the following title:
• FaceHugger: The ALIEN Tracker Applied to Faces. In Proceedings of the European Conference on Computer Vision (ECCV), Demo Session, Florence, Italy, 2012.
A real-time demo of the released downloadable application (http://www.micc.unifi.it/pernici/) will also be given during the conference.
Video demos showing the capabilities of this novel technology can be seen here: http://www.youtube.com/user/pernixVision.
To bring machine capabilities close to those of humans, we need to integrate all five human senses into machines; only then will they be truly compelling and useful. Alongside a video camera, we need a microphone with sound processing, plus smell, touch, and taste analysis, in order to reason about the current scenario. A robot packed with sensors and backed by an AI computing system could come close to a human.
Computer vision is in a much better state than it was a decade ago, and it has reached this point at an ever-increasing rate. Still, anyone who tries to think outside the box will run into setbacks, because computer vision has its own limitations and must be applied with those limitations in mind. There is no doubt that there is enormous scope for improvement, and it will come in time, since this is an area of interest for many researchers.
Yes, Kinect is often cited as the first major proof point that embedded vision has gone mainstream. But the computer's "understanding" of, and ability to act intelligently on, what it "sees" is still very rudimentary.
A factor to consider is the nature of the object being carried by one person towards another. Holding a wrapped gift out in front as one person approaches another suggests the possibility of a transfer (gift giving); the facial expression and degree of engagement with that person might suggest it as well. In contrast, a person with a computer case over their shoulder or a wedding ring on their hand would not be expected to be about to give those items to the individual they are approaching.
All this sounds quite interesting but also quite difficult. The human brain learns over the years as a child develops; teaching that to a computer sounds like a gigantic task to me. Add to this the fact that we don't fully understand how the human brain works. The brain is a difficult subject to study, because we only have brains with which to study it, and it is a lot easier when the system under study is less complex than the tool being used to study it.