Max, I would love to understand how you are going to implement depth perception with the two Pixy machines! That should be a cool programming project; I am wondering which vision libraries you are planning to use. The attraction of the Kinect is the built-in horsepower/processing for the depth reporting. Sounds like a fun project for sure.
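For anyone curious what "depth from two cameras" boils down to: with two matched views, depth falls out of the disparity (the horizontal shift of a feature between the left and right images) by similar triangles, Z = f * B / d. Here is a minimal sketch of that formula; the focal length and baseline values are made-up placeholders, not calibration data from any actual Pixy setup:

```python
# Depth from stereo disparity: Z = f * B / (x_left - x_right).
# focal_px and baseline_cm below are illustrative defaults; real values
# would come from calibrating the two cameras.

def stereo_depth_cm(x_left, x_right, focal_px=320.0, baseline_cm=10.0):
    """Depth (cm) of a matched feature from its horizontal disparity.

    x_left / x_right are the feature's pixel columns in the left and
    right images; a nearer object has a larger disparity.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_cm / disparity

# With these placeholder numbers, a 16-pixel disparity works out to
# 320 * 10 / 16 = 200 cm, i.e. the object is about 2 m away.
```

The hard part in practice is not this formula but reliably matching the same feature in both views, which is where a vision library earns its keep.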
@Robotics Developer: ...the areas of software, vision, mechanical and systems for robot designs keep me occupied learning and exploring new areas, a wonderful thing for an engineer at heart!
I agree -- one of the reasons for having hobby projects is that it's a great way to learn "stuff" -- for example, working on my Inamorata Prognostication Engine is affording me the opportunity to learn all sorts of things, like how to calculate the dates of Full Moons and Blue Moons :-)
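For readers wondering what that calculation looks like: a first-order approach just steps forward from a known lunar epoch by the mean synodic month (~29.53 days). A rough sketch, assuming the commonly cited reference new moon of 2000-01-06 18:14 UTC (not the author's actual method, which may well be more precise):

```python
from datetime import datetime, timedelta

SYNODIC_MONTH = 29.530588853          # mean days from new moon to new moon
REF_NEW_MOON = datetime(2000, 1, 6, 18, 14)  # a commonly used epoch (UTC)

def full_moons_in_year(year):
    """Approximate UTC datetimes of the full moons in a given year.

    Uses the *mean* synodic month, so each date can drift up to roughly
    half a day from the true astronomical full moon.
    """
    # The first full moon after the epoch is half a cycle later.
    t = REF_NEW_MOON + timedelta(days=SYNODIC_MONTH / 2)
    # Jump in whole cycles to land just before the target year.
    cycles = (datetime(year, 1, 1) - t).days / SYNODIC_MONTH
    t += timedelta(days=int(cycles) * SYNODIC_MONTH)
    moons = []
    while t.year <= year:
        if t.year == year:
            moons.append(t)
        t += timedelta(days=SYNODIC_MONTH)
    return moons

def blue_moons(year):
    """Months with two full moons (one popular 'Blue Moon' definition)."""
    by_month = {}
    for m in full_moons_in_year(year):
        by_month.setdefault(m.month, []).append(m)
    return {month: ms for month, ms in by_month.items() if len(ms) > 1}
```

For example, this sketch finds 13 full moons in 2020, with October getting two of them -- the "Blue Moon" that actually fell on Halloween that year.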
Max, consider (if you have the bandwidth and processing power) adding a Kinect to the robot. It has nice sensor features (depth being one of the really nice ones) as well as vision. Adding that to the platform could provide you with years of experimentation and exploration (pun intended).
You are most welcome! If there is any help I can provide, please contact me! Robotics encompasses a diverse mix of disciplines and as such it provides an endless opportunity to learn and grow. As an EE by degree, the areas of software, vision, mechanical and systems for robot designs keep me occupied learning and exploring new areas, a wonderful thing for an engineer at heart!
@Robotics Developer: ...sequencing of the ultrasonics (say left, back, front, right) to prevent echoes from interfering...
Agreed -- in fact, we are thinking of giving the sensor boards the ability to automatically perform a "round-robin" ping, one sensor after the other, and to keep doing so continuously -- whenever the main processor is ready (or interested), it can simply ask any sensor "what was your last reading?"