When I was looking at the details of financial number-crunching applications on FPGAs, the limiting factor always seemed to be the memory interface. Going on and off chip for big chunks of memory to feed the calculation engines was an issue. If the memory access was predictable (as in an FFT), you could always pipeline things, but financial calculations required indirection, and the resulting additional access time was a killer. Maybe new algorithms are available now to minimize this effect. Anyone out there have an update (or is it too secret to tell...)?
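To make the distinction concrete, here is a minimal sketch (in Python, just to show the access patterns — names and functions are illustrative, not from any FPGA toolchain). The first loop's addresses are known in advance, so a hardware pipeline can prefetch and hide off-chip latency; the second loop's addresses depend on a prior lookup, so each fetch stalls on the index:

```python
def streaming_sum(data):
    # Predictable, unit-stride access (FFT-like): addresses are known
    # ahead of time, so loads can be pipelined/prefetched.
    total = 0.0
    for i in range(len(data)):
        total += data[i]
    return total

def indirect_sum(data, index):
    # Data-dependent access: the address of each load depends on the
    # value fetched from `index` first, so the pipeline stalls waiting
    # on each round trip to memory.
    total = 0.0
    for i in range(len(index)):
        total += data[index[i]]
    return total
```

Both loops do the same amount of arithmetic; the difference is purely in when the memory addresses become known, which is what determines whether the off-chip latency can be hidden.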
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act on data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons learned from IoT deployments.