I beg to differ. I have implemented both DDC and DUC for a CDMA/LTE radio head handling multiple sector-carriers within a single FPGA. It is much easier to write and maintain code in RTL than in a schematic-based tool. You take advantage of the Verilog generate statement and a parameterizable number of sectors/carriers. Plug in an IP core from the MegaWizard, wrap it with RTL, and you are done. It was horrible for me to reverse engineer legacy automatic gain control code done in Simulink (schematic based), which ran to multiple Letter-size pages. RTL is simply more compact and easier to search and analyze. Schematic-based design is rather a thing of the past, abandoned for RTL in the '90s. I guess it is used by marketing to attract people not familiar with RTL coding. For them the learning curve is lower at the beginning, but later, as they get familiar with VHDL/Verilog (they will need to, since in the real world it is unlikely that Simulink/schematic design will be the only thing in the chip), they realize HDL gives them finer-grained control and more reusability than schematics. Also, tracing paths that fail timing can be hard, since they may not match between the schematic and the generated RTL.
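To give a sense of what I mean by the generate-statement approach, here is a minimal sketch. The module and port names (`ddc_bank`, `ddc_channel`) are made up for illustration; in practice `ddc_channel` would be your RTL wrapper around the MegaWizard-generated IP core.

```verilog
// Illustrative sketch only: one DDC instance per sector-carrier,
// replicated with a generate loop. NUM_CARRIERS is a parameter, so
// adding a carrier is a one-line change at instantiation time.
module ddc_bank #(
    parameter NUM_CARRIERS = 4,
    parameter DATA_W       = 16
) (
    input  wire                           clk,
    input  wire                           rst_n,
    input  wire [NUM_CARRIERS*DATA_W-1:0] adc_in,
    output wire [NUM_CARRIERS*DATA_W-1:0] baseband_out
);
    genvar i;
    generate
        for (i = 0; i < NUM_CARRIERS; i = i + 1) begin : g_chan
            // Hypothetical per-carrier wrapper around the vendor IP core
            ddc_channel #(.DATA_W(DATA_W)) u_ddc (
                .clk   (clk),
                .rst_n (rst_n),
                .din   (adc_in      [i*DATA_W +: DATA_W]),
                .dout  (baseband_out[i*DATA_W +: DATA_W])
            );
        end
    endgenerate
endmodule
```

Try maintaining the schematic equivalent of that for eight carriers and you will see why I prefer RTL.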
I took over a legacy DDC design done in DSP Builder 5.0.0. Matlab 2008 and newer do not support it. Altera support told me to install an older Matlab and an older Quartus and convert the project from DSP Builder 5.0 to 7.1, and then from 7.1 to 8. I never managed to do that successfully. Had the legacy project been done in Verilog, I would have been able to just reuse it. I ended up doing a new DDC from scratch.
Simulink has one big advantage, though: simulation of the DSP datapath is much faster than RTL. This is because of those damn Altera/Xilinx simulation models, where a single FIR filter has over 100k lines of generated Verilog mapped to native device primitives, effectively turning your simulation into something gate-level-like, with its mediocre speed.
Hi Dr. DSP, Our tools are by no means limited to this kind of DSP. There are a number of built-in functions, as well as the ability to import external IP, including IP provided by the Xilinx CORE Generator. Other applications include scientific research, ultrasound and non-destructive testing, beamforming, frequency-domain transforms and measurements, real-time modulation and demodulation... basically anything benefiting from the real-time performance or DSP capabilities of an FPGA. Like any abstraction, there are some performance tradeoffs, but by incorporating parameterizable Xilinx IP, you do get a bit of the best of both worlds. If you're interested in evaluating LabVIEW, you can download a free 30-day trial at ni.com/trylabview. You can also install the LabVIEW FPGA module, found here: https://lumen.ni.com/nicif/us/evaltlktembdes/content.xhtml. I plan to post the code I used here, though I haven't gotten to that yet. Let me know if you're interested. And if Max will have me ;) , I'll post here on more applications in the future!
Nice to see the system-oriented graphical design capture approach being promoted, and congrats to NI for creating such a thorough tool suite that integrates well with Xilinx Core Gen and with real-world lab signals.
Ever since graphical design capture for hardware design first appeared with SPW back in the early '90s, I have been trying to persuade ASIC designers that drawing block diagram pictures is a lot more productive and less error-prone than writing RTL code.
For some reason, it is a tough sell. The average ASIC or FPGA digital designer thinks "real men write code" -- even after you demonstrate all the advantages of a higher level of abstraction.
Even today, when you talk to someone about ESL, they start rolling out the virtues of SystemC or SystemVerilog -- but it still amounts to a bunch of code that must be written, with no ability to visualize the system.
When you look at the average DSP application, you see datapaths with a lot of number-crunching, feedback, and dynamic range and precision adjustments, and you naturally want and need to draw a block diagram so that you can visualize the system and its complexities and dependencies.
If you're going to draw those pictures anyway, why not make the pictures be your source database? Nothing could be more natural!
This approach seems to really deliver on the capabilities of FPGAs in DSP applications. Now if we can get the same kind of thing for other applications (where you can specify, generate, test, and deploy without touching things like gates or RTL code), design efficiency will jump by an order of magnitude. Any other apps we can target with this approach?