Max, I would love to understand how you're going to implement depth perception with the two Pixy machines! That should be a cool programming project; I'm wondering which vision libraries you were planning on using? The attraction of the Kinect is the built-in horsepower/processing for the depth reporting. Sounds like a fun project for sure.
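For anyone curious, depth from two side-by-side cameras usually comes down to triangulating horizontal disparity. Here's a minimal sketch of that math; the constants and the idea that each Pixy reports the x pixel position of the same tracked object are assumptions for illustration, not the actual Pixy API:

```python
# Hypothetical stereo-depth sketch for two side-by-side cameras.
# FOCAL_LENGTH_PX and BASELINE_CM are made-up example values; calibrate
# real cameras to get them.

FOCAL_LENGTH_PX = 320.0   # assumed focal length, in pixels
BASELINE_CM = 10.0        # assumed separation between the two cameras

def depth_cm(x_left, x_right):
    """Triangulate distance from horizontal disparity: Z = f * B / d."""
    disparity = x_left - x_right   # pixels; larger disparity = closer object
    if disparity <= 0:
        return None  # mismatched blocks, or object beyond effective range
    return FOCAL_LENGTH_PX * BASELINE_CM / disparity

print(depth_cm(180, 148))  # 32 px of disparity -> 100.0 cm
```

The hard part in practice isn't this formula, it's matching the same object between the two images reliably, which is where a vision library would earn its keep.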
@Robotics Developer: ...the areas of software, vision, mechanical and systems for robot designs keep me occupied learning and exploring new areas, a wonderful thing for an engineer at heart!
I agree -- one of the reasons for having hobby projects is that it's a great way to learn "stuff" -- for example, working on my Inamorata Prognostication Engine is affording me the opportunity to learn all sorts of things, like how to calculate the dates of Full Moons and Blue Moons :-)
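A quick back-of-envelope sketch of that calculation, in case it's useful to anyone: you can approximate full-moon dates by stepping the mean synodic month from a known full moon. This is an assumption-laden illustration (the reference epoch and the "second full moon in a calendar month" definition of a blue moon are my choices); real almanac accuracy needs proper ephemeris corrections, e.g. the Meeus algorithms:

```python
from datetime import datetime, timedelta

# Approximation: step the mean synodic month (29.530588853 days) forward
# from a known full moon. Good to within a day or so, not almanac-grade.
SYNODIC_DAYS = 29.530588853
KNOWN_FULL_MOON = datetime(2000, 1, 21, 4, 40)  # a reference full moon (UTC)

def full_moons_in_year(year):
    """Approximate datetimes of all full moons falling in the given year."""
    n = round((datetime(year, 1, 1) - KNOWN_FULL_MOON).days / SYNODIC_DAYS)
    t = KNOWN_FULL_MOON + timedelta(days=n * SYNODIC_DAYS)
    moons = []
    while t.year <= year:
        if t.year == year:
            moons.append(t)
        t += timedelta(days=SYNODIC_DAYS)
    return moons

def blue_moons(year):
    """One common definition: the second full moon in a calendar month."""
    moons = full_moons_in_year(year)
    return [b for a, b in zip(moons, moons[1:]) if a.month == b.month]

print(len(full_moons_in_year(2020)))   # 13 full moons in 2020
print(blue_moons(2020)[0].month)       # 10 -> the Oct 31, 2020 blue moon
```

Since the synodic month isn't an exact divisor of the year, most years get 12 full moons and occasionally 13, which is where blue moons come from.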
Max, consider (if you have the bandwidth and processing power) adding a Kinect to the robot. It has nice sensor features (depth being one of the really nice ones) as well as vision. Adding that to the platform could provide you with years of experimentation and exploration (pun intended).
You are most welcome! If there's any help I can provide, please contact me! Robotics encompasses a diverse mix of disciplines, and as such it provides an endless opportunity to learn and grow. As an EE by degree, the areas of software, vision, mechanical and systems for robot designs keep me occupied learning and exploring new areas, a wonderful thing for an engineer at heart!
Drones are, in essence, flying autonomous vehicles. The pros and cons surrounding drones today may well foreshadow the debate over the development of self-driving cars. In the context of a strongly regulated aviation industry, "self-flying" drones pose a fresh challenge. How safe is it to fly drones in different environments? Should drones be required to stay within their operators' visual line of sight, as piloted airplanes are? Join EE Times' Junko Yoshida as she moderates a panel of drone experts.