LabVIEW is 25 this year, so it’s a perfect time to pause for reflection and to speculate on the future.
From the beginning, we thought we were on to something special. The first Mac had recently been released, bringing with it the first mainstream GUI and mouse. We knew that personal computers held great potential for scientists and engineers, and we were confident that the GUI and mouse would become the primary way people interacted with their computers.
Our thoughts for LabVIEW were simply to combine the power of the PC with the expressiveness of the GUI and hide the unnecessary complications of programming. We wanted to provide a tool that would be as useful to scientists and engineers automating measurements as the spreadsheet was for people working with financial data. Nobody asked us to build LabVIEW. We made it with a “build it and they will come” mentality. We wanted to build a tool that offered a step-function improvement in productivity in designing test and measurement systems.
We didn’t set out to create the G programming language, but that’s where we ended up. We realized we needed that level of flexibility and control to enhance engineers’ productivity and support all of the types of I/O and processing our customers would require. After much deliberation, we settled on a graphical, structured dataflow representation that resembled actual circuits and block diagrams more closely than traditional forms of programming did.
It’s rewarding to see the many ways our customers have used LabVIEW. From detecting cancer earlier to providing the blind with experiences once thought impossible. From conducting the most advanced experiments in the world that aim to uncover the mysteries of the universe to inspiring our children to be future technology leaders. In return, our customers inspire us to maintain our mission and provide them with continuous productivity improvements and access to the latest technologies.
Until now, most of what we’ve seen in the mainstream market are multiple homogenous cores on a single chip. Specialized processors like GPUs have been reserved for specific tasks and haven’t been readily accessible to programmers. Looking to future technologies, I expect to see more specialized processing cores working alongside general-purpose CPUs to meet the increasing demand for processing power. Additionally, as FPGAs evolve, they will become more prevalent and more capable. To harness the available power, engineers will need productive tools that assist them in partitioning and targeting logic to the appropriate processors. Once harnessed, this increased processing power will help engineers do in real time what they previously had to do offline.
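As a small illustration of that partitioning idea, here is a sketch in ordinary Python rather than G; the block_rms routine and the simulated sample blocks are invented for the example. One analysis is simply fanned out across however many CPU cores the machine offers.

import math
from multiprocessing import Pool, cpu_count

def block_rms(samples):
    """Root-mean-square of one block of acquired samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

if __name__ == "__main__":
    # Stand-in for acquisition: eight blocks of 1,000 samples each.
    sample_blocks = [[(i * j) % 7 - 3 for i in range(1000)] for j in range(8)]

    # Partition the blocks across all available CPU cores.
    with Pool(processes=cpu_count()) as pool:
        print(pool.map(block_rms, sample_blocks))

The harder problem the paragraph above alludes to is deciding which pieces belong on which processor; the sketch only shows the easy case of identical work spread over identical cores.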
Wireless and mobile technologies are prevalent in our everyday lives. These technologies make it possible to deploy intelligent, sensor-rich devices over large areas. Next-generation networking technologies like IPv6 and high-speed wireless networks promise to support what will become an enormous number of connected devices capable of acquiring and transferring unthinkable amounts of data. I expect this technology will ultimately help prevent hunger and disease, preserve natural resources, and improve our quality of life by bringing in more real-world data, enabling timelier and better decisions and more accurate models.
The way people interact with software is another area of rapid technological advancement. In the future, I imagine we’ll see much more efficient input mechanisms. Just as the mouse took over many input duties from the keyboard, I expect to see touch- and gesture-based input become a more prominent aspect of software interaction. Software will become more intuitive, reducing the learning curve as it continues to incorporate physical metaphors.
We made a decision 25 years ago to base the LabVIEW core programming language on structured data flow. The inherent parallelism of data flow was a natural fit for the acquisition-analysis-presentation problem our customers were solving (a sketch of the idea follows below). When the industry moved to multicore machines, that decision made us look downright prescient. Today and into the future, LabVIEW is naturally positioned to integrate these next-generation technologies and many more that engineers and scientists will rely upon to meet the most difficult challenges they face. That being said, we still have a lot of work to do. As our mission is to equip our customers with tools that accelerate their productivity, innovation and discovery, we are excited to see what the future will bring.
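To make the dataflow point concrete, here is a rough sketch in plain Python; the node names acquire, filter_data, and compute_stats are invented for illustration. Each “node” fires as soon as its inputs are available, so the two downstream branches run in parallel automatically, just as two independently wired subdiagrams do in G.

from concurrent.futures import ThreadPoolExecutor

def acquire():
    return [0.1, 0.5, 0.4, 0.9, 0.2]        # stand-in for a DAQ read

def filter_data(samples):
    return [s for s in samples if s > 0.25]  # one downstream branch

def compute_stats(samples):
    return sum(samples) / len(samples)       # an independent branch

with ThreadPoolExecutor() as pool:
    data = pool.submit(acquire)              # source node
    # Both branches depend only on data, so they can run concurrently
    # as soon as it resolves; the parallelism is implicit in the wiring.
    filtered = pool.submit(lambda: filter_data(data.result()))
    stats = pool.submit(lambda: compute_stats(data.result()))
    print(filtered.result(), stats.result())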
The future is bright.
Father of LabVIEW
LabView sucks. I don’t mean the product or business; what sucks is access to LabView for the small company, college classroom, average engineer, or hobbyist. When my college group looked into software for a robotics project, we considered LabView and several other options. LabView, even with student discounts, was just too expensive, and the “basic package” just didn’t offer enough functionality without purchasing many confusing, high-dollar modules. We went with Microsoft’s VisualBasic and our project was a hit. Despite many of us loathing Microsoft, VB6 was available, cheap, easy to ramp up on, and powerful. Fast-forward a decade, and I still don’t use LabView, because it still looks like an expensive product, again with many high-dollar add-ons, geared towards companies with deep pockets and dedicated resources. Even though I’m a nobody engineer in the electronics industry and only influence six-figure budgets and a hundred or so other engineers and hobbyists, I have never recommended LabView because of my experience. Sure, this is just my anecdote, but I suspect I’m not alone in having been put off by LabView. Luckily, these days, with lots of computing power available, there are many graphical programming languages out there, many of which are free or free-to-try, and many are focused on I/O functionality and DSP.
There seems to be an increasing trend of moving towards Python for the sort of thing previously done by LabVIEW.
Python has all sorts of interesting packages, such as numpy and USB wrapper packages, making it a very versatile workbench (see the sketch below).
And it's all free....
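For instance, a few lines with numpy take a simulated noisy measurement all the way to its dominant spectral peak (a sketch; the 50 Hz tone and the sample rate here are made up):

import numpy as np

fs = 1000.0                              # sample rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)          # one second of samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal))   # magnitude spectrum
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
print("dominant frequency: %.1f Hz" % freqs[np.argmax(spectrum)])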
I want to offer my opinion on LabView from the point of view of an NI customer.
I came from a chip-design background, so I have experience with C, Verilog and many other design languages. My experience with LabView has been a blast; it is amazingly powerful and simple to use. One case in point: with LabView, I was able to write a program that could reliably communicate between two microcontrollers via a CAN interface in one afternoon. I am not aware of any other development environment that can give me that fast a development time. Not to mention that when I am stuck on any problem, I can call NI and have somebody at NI help me out. Good software and support ain’t cheap. I am a happy customer of LabView.
Paul has a point that much of NI’s hardware is expensive. Part of the reason for the high hardware cost is that it is really reliable and capable. If you don’t need those capabilities, there is some really inexpensive hardware (e.g., the $13 Luminary Micro LM3S8962 microcontroller) that LabView can run on.
Speaking of student accessibility, apparently LabView has addressed some of these concerns: it is reportedly the “core language used to develop Mindstorms NXT software.”