I'd like to comment on the architecture and performance side first.
The 3DPLD is probably the same idea as the DPGA (generally called a multi-context FPGA) researched by Prof. A. DeHon (when he was a PhD student at MIT; he is currently at the University of Pennsylvania), one of the evangelists of reconfigurable computing. Around the same time (maybe 1996-97), Xilinx researchers also proposed a time-multiplexed FPGA, and even earlier, in the early 90s, Fujitsu obtained a patent on the multi-context approach. Developing the architecture is easy, but scheduling the contexts (tasks) is difficult when you change contexts dynamically without deadlock and so on. I remember that some researchers in past decades proposed exactly this kind of "3D placement": a 2D programmable field plus the time domain.
From the viewpoint of chip-level performance (not the internals), throughput is of course divided by the number of contexts. Dynamic context switching is also overhead in both area and time, and it breaks the data flows on the chip. Researchers therefore try to avoid dynamic changes, treating the device as statically as possible even when it can reconfigure dynamically, in order to reduce these overheads. Some researchers concluded that "we must find a killer application that uses dynamic change (i.e., reconfiguration) effectively." Unfortunately, no such application has been found, and at least a decade has passed. In addition, I think a key is how to use the inter-configuration communication proposed by B. Hutchings (Prof. at BYU).
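To make the "throughput is divided by the number of contexts, plus switching overhead" point concrete, here is a toy back-of-the-envelope model. It is my own illustration, not from any of the cited work; the function name and parameters are hypothetical, and it assumes a simple round-robin schedule with a fixed reconfiguration cost per switch.

```python
# Toy model (illustrative only): with N contexts time-multiplexed on one
# fabric, each context sees 1/N of the peak throughput, and every context
# switch burns extra cycles reloading configuration, lowering it further.

def effective_throughput(peak_ops_per_cycle, n_contexts,
                         cycles_per_context, switch_cycles):
    """Average ops/cycle seen by one context under round-robin sharing.

    cycles_per_context: useful cycles a context runs before switching out.
    switch_cycles: cycles lost to reconfiguration at each switch.
    """
    useful = cycles_per_context                 # work done in one slot
    total = cycles_per_context + switch_cycles  # slot including overhead
    # Each context gets 1/N of the time, scaled by the useful fraction.
    return peak_ops_per_cycle * (useful / total) / n_contexts

# A static (single-context) device pays no switch penalty:
static = effective_throughput(100, 1, 1000, 0)   # 100.0 ops/cycle
# Four contexts with a 100-cycle reconfiguration per switch:
multi = effective_throughput(100, 4, 1000, 100)
print(static, round(multi, 2))
```

The gap between the two numbers is exactly why, as noted above, designers prefer to keep a dynamically reconfigurable device as static as possible unless an application really exploits the reconfiguration.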
Regarding the marketing side:
3D or DPGA vendors must be able to claim that application designers need not be familiar with the device or its system, which means the device has to support or fit into the current design process. So the design tools will be more complex, or more tool chains will be necessary, probably taking more time. Application designers face time-to-market pressure and have no choice if that is not achievable. So not only a killer application but also traditional applications should see a benefit, from the design-time or performance perspective.
With all due respect, isn't this the dynamically reconfigurable FPGA that NEC Electronics already commercializes under the name STP (Stream Transpose Processor)?
Startup companies that introduce a new reconfigurable architecture have a difficult time, because their potential customers have established processes built around existing technology that the new technology does not readily map to. Nobody wants to change the process. So unless these companies also offer the entire array of CAD tools needed to program their fabric, they may not succeed.
The failure of so many reconfigurable architectures seems to be more about finding the right technology, educating potential customers, and strong, strong marketing than about the specific architecture. This approach sounds a lot like Chameleon, and they couldn't convince the users.