I've been using Octave for Galois Field arithmetic and I'm really happy with it. For some reason the script I wrote most recently takes 1/20th (!) as long to run on Octave as on Matlab. (It operates on BCH codes.) (I'm using Octave on the Mac.)
I transitioned from Excel to Python for running basic calculations and simulations a few months back. And it works great! Tons of great modules (matplotlib and numpy are a must) and, more than anything, it's fast as hell. And by fast I mean 50 million points in a few seconds, and it can plot them without stuttering too!
True, it's not exactly a simulation tool; you've got to spend a little time thinking about how to run the simulation, but once that's out of the way, it's all smooth sailing.
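The kind of vectorized number-crunching described above is easy to sketch with NumPy. This is a minimal illustration (a smaller array is used here for brevity; the 50-million-point case works the same way, and the signal is just an arbitrary example):

```python
import numpy as np

# Generate a large batch of sample points in one vectorized call
# (scaled down from the 50 million mentioned above).
n = 1_000_000
x = np.linspace(0.0, 10.0, n)
y = np.sin(x) + 0.1 * x  # arbitrary example signal

# Summary statistics over the whole array are computed in milliseconds
print("mean:", y.mean(), "max:", y.max())

# Plotting would then be, e.g.:
#   import matplotlib.pyplot as plt
#   plt.plot(x, y); plt.show()
```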
PS: I used PSoC Creator before, and the line "which allows users to design their own Programmable System on a Chip and have it manufactured (that part costs)" is misleading. You can buy a generic PSoC 3/4/5 and program it with Creator, i.e., it's all firmware (although that term actually encompasses a lot more when looking at a PSoC) and no custom manufacturing is involved. The cool thing is you can always move stuff around and reprogram, so it's pretty much like a custom SoC.
An alternative like this to Excel is very good for engineers. I always felt that engineers need more, and should be able to upgrade it themselves without calling customer support or getting into copyright issues. I started using Ubuntu, and in the last three years it has never crashed; it's easy to operate and has so much flexibility.
You could add PARI/GP from France for advanced algebraic analyses; it's actively maintained, and also goes well beyond what I understand.
I have a question, though: in 2000, I was using a program that could fit a dataset to a function optimally (least-squares, outlier rejection, etc.) by trying hundreds of candidate forms, e.g. series, logs, ratios, transcendentals, and relations among them. Do you know the name of that program?
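I can't name that particular program, but the core idea it describes — fit many candidate model forms by least squares and keep the one with the smallest residual — can be sketched in a few lines of Python. The candidate set here is just an illustrative assumption, not that program's actual catalog:

```python
import numpy as np

def auto_fit(x, y):
    """Try several candidate model forms and return the best least-squares fit.

    Each candidate maps x to a design matrix whose coefficients are found
    with np.linalg.lstsq; the winner is the form with the lowest residual
    sum of squares.
    """
    candidates = {
        "linear: a + b*x":      lambda x: np.column_stack([np.ones_like(x), x]),
        "quad:   a + b*x + c*x^2": lambda x: np.column_stack([np.ones_like(x), x, x**2]),
        "log:    a + b*log(x)": lambda x: np.column_stack([np.ones_like(x), np.log(x)]),
        "sqrt:   a + b*sqrt(x)": lambda x: np.column_stack([np.ones_like(x), np.sqrt(x)]),
    }
    best = None
    for name, design in candidates.items():
        A = design(x)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        rss = float(np.sum((A @ coef - y) ** 2))
        if best is None or rss < best[2]:
            best = (name, coef, rss)
    return best

# Example: synthetic data that is actually logarithmic
x = np.linspace(1.0, 10.0, 50)
y = 2.0 + 3.0 * np.log(x)
name, coef, rss = auto_fit(x, y)
print(name, coef, rss)
```

A real tool of that kind would also do outlier rejection (e.g. iteratively re-fitting after dropping high-residual points), which is omitted here for brevity.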
While there are some great tools in this and the original list, I am left scratching my head as to why LTspice failed to make the list again. I use it very often and it is at the top of my list of free tools.
Another nice tool is the free version of Mathcad. When you let the free 30-day trial lapse, you get a very stripped-down version that is still quite useful as a math scratchpad that handles units. Very useful for free.
Finally, if you are going to include tools tied to vendors' parts like Cypress's PSoC Creator and TI's WEBENCH, you should include the free versions of Quartus from Altera and ISE WebPack from Xilinx.
It's a bit iffy to classify R as "an alternative to Excel". R is not a spreadsheet, it never has been and never will be...and Excel has nowhere near the data manipulation capabilities of R. There is very little overlap between their capabilities.
As a design engineer, I find Excel useful about once every three to four years, usually for things like BOMs, but I use R almost daily for REAL engineering work. R makes short work of anything that my trusty HP calculator can't easily handle. It was designed for statistical work (it traces its roots back to Bell Labs' 'S' language in the 1970s, and has seen a great deal of refinement over the years), but it's more properly presented (in my opinion) as a "mathematical Swiss Army knife".
I use Excel or its Libre Office equivalent occasionally. It can be a terrific way to present certain kinds of information, and also do a quick analysis. It's also nice that you don't have to use it every day to maintain proficiency.
Some of my uses:
1. I used it to help design an FPGA-based Baud Rate Generator. Excel was a great way to calculate divisors and prescaling for standard Baud rates, and calculate the frequency error.
2. It's a great way to work up an FPGA pinout. Xilinx provides generic pinouts for their parts and you can add your own signal names and notes. Xilinx Excel files show which pins are dedicated to power, ground, and pre-defined functions so you can avoid them for user I/Os.
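The divisor arithmetic in item 1 is straightforward to illustrate; here is a minimal sketch of the same calculation a spreadsheet column would do. The 50 MHz system clock and 16x oversampling factor are assumptions for the example, not values from the original design:

```python
# Baud-rate divisor and frequency-error table, as one might build in Excel.
F_CLK = 50_000_000   # system clock in Hz (hypothetical)
OVERSAMPLE = 16      # typical UART oversampling factor (assumed)

for baud in (9600, 19200, 38400, 57600, 115200):
    divisor = round(F_CLK / (OVERSAMPLE * baud))      # nearest integer divisor
    actual = F_CLK / (OVERSAMPLE * divisor)           # baud rate actually produced
    error_pct = 100.0 * (actual - baud) / baud        # frequency error
    print(f"{baud:>7}  div={divisor:>4}  actual={actual:10.1f}  err={error_pct:+.3f}%")
```

A typical design rule is to keep that error under a couple of percent so the receiver still samples mid-bit over a full character.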
@ antedeluvian the use of Excel in Electronic Engineering
There was a time I did not know how to use Excel; then I needed to calculate PCB dimensions for impedance control of broadside-coupled differential stripline. The textbook equation was a real pig, but I learned how to use the Excel Goal Seek function to automatically iterate guesses until it found the right one. The prototype boards measured within 2% of the target.
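Goal Seek is essentially numeric root-finding, and the same iterate-until-it-matches approach can be sketched with bisection. The impedance function below is a stand-in placeholder so the example is self-contained — it is not the actual broadside-coupled stripline equation:

```python
import math

def goal_seek(f, target, lo, hi, tol=1e-9):
    """Find x in [lo, hi] with f(x) == target, by bisection.

    Assumes f is monotonic on the interval (either direction), like
    Excel's Goal Seek homing in on a single crossing.
    """
    increasing = f(lo) < f(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if (f(mid) < target) == increasing:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Placeholder impedance model: impedance falls as trace width grows.
# (The real broadside-coupled stripline formula would go here.)
def z_diff(width_mm):
    return 120.0 * math.log(10.0 / width_mm)

w = goal_seek(z_diff, target=100.0, lo=0.05, hi=5.0)
print(f"width for 100 ohms: {w:.4f} mm")
```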
@davemcguire Thanks for the insight on R. But isn't it true that many of us try to use Excel for something it's not equipped to do because we are not aware of a good alternative? Or too lazy to learn. :-) Would love to know more about your history with R, how you learned about it, what types of things you use it for, wow I think I am proposing you write up a blog on it! :-)
Well yes, a lot of people abuse existing tools due to inertia or just plain laziness. That's a bit of a shame, but our work ethic isn't what it once was.
I myself stumbled upon R in a search for a good math package. I had used Octave a bit, and mostly liked it, but being a fan of Lisp, I was immediately drawn to R's similarity to the Scheme language. Specifically, I needed to do some curve fitting to linearize the response of a sensor and come up with coefficients to embed in firmware. R made short work of it, and it made so much sense that I've been hooked ever since. I've since written an entire suite of instrument control software with it, so I go straight from lab equipment into data structures in R, complete with graphing if needed.
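That sensor-linearization workflow (fit the inverse of the sensor's response, then embed the coefficients in firmware) can be sketched briefly. The commenter used R for this; the sketch below uses Python/NumPy to show the same idea, and the calibration data is synthetic:

```python
import numpy as np

# Synthetic "raw reading vs. true value" calibration data
# (stand-in for real bench measurements).
true_value = np.linspace(0.0, 100.0, 21)
raw = 3.0 + 0.8 * true_value + 0.002 * true_value**2  # mildly nonlinear sensor

# Fit the inverse map raw -> true as a cubic polynomial; these are the
# coefficients one would embed in firmware to linearize the output.
coeffs = np.polyfit(raw, true_value, deg=3)
print("firmware coefficients (c3..c0):", coeffs)

# Verify: apply the correction and check the worst-case residual
corrected = np.polyval(coeffs, raw)
print("max linearization error:", np.max(np.abs(corrected - true_value)))
```

In R the equivalent one-liner would be along the lines of `lm(true ~ poly(raw, 3))`.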
And I don't have to fight with Windows to use it. ;)
A lot of people use Excel (or an equivalent; I use the name generically, like 'Xerox') for the type of curve fitting I needed to do that day, or other related stuff, and if that tool works for them, that's great. The R approach better matches the way my mind works, and the productivity boost has been great for both myself and those who depend on my work. It's not for everyone, but give it a try, you might like it. The learning curve is steep, like any other complex tool, but it's worth it.
Well, a lot of people "abuse" existing tools because they don't have time to learn 100 different tools, also. It's not "lazy" to realize that there is a finite amount of time available in a day (or a project) and to cope accordingly. If my employer allowed me time to get up to speed on all the "most appropriate" tools for any given project, I would be learning different tools all the time instead of doing my actual job :-) because I don't do just one kind of work day-in and day-out.
So there is an actual engineering tradeoff in the tools space: learning a few very flexible tools in-depth, and using them for most tasks...or learning lots of different, specific, focused tools superficially. I suspect most people are somewhere in the middle.
Quite Universal Circuit Simulator (QUCS) is a great little Spice-like open-source tool that runs quickly, has a wide range of components and a rather intuitive interface (but a so-so manual), and will execute S-parameter analysis. Available at qucs.sourceforge.net/.