I didn't write a BASIC interpreter, but I got as close as you can to that. BASIC is the reason I joined HP's fabled Calculator Products Division in Colorado. In college, I learned Algol. Couldn't stand it. Could not stand punched cards either. I could see that computers were going nowhere with batch processing. Then the university got an HP 9830 desktop calculator. You programmed it in BASIC. It had a keyboard, display, mass storage (digital cassette tape), and a thermal page printer. It was 1974 and it was clear this was the future of computing. I joined the HP division that made that machine when I graduated in 1975.
I did work on the HP 9825's HPL interpreter. Think of HPL as BASIC with the vowels sucked out. Meanwhile, HP BASIC began to rapidly advance. Labeled GOTOs, subroutines, and subprograms. Long variable names. Multidimensional arrays and matrix math. Advanced I/O. Instrument control. Interrupts handled at a high level. Graphics (monochrome then color). That all happened between 1972 and 1978.
Dartmouth BASIC was indeed limited, but BASIC had a long and useful life.
I've always loved Basic. I've always found it pretty easy to learn and intuitive to use. I used the early micro versions, the Sinclair Spectrum (think it was Timex in the US?) and the PC versions up to QuickBASIC, in which I wrote a terminal emulator that worked very well.
Did the early versions teach you bad habits? Yes. Can you teach yourself good habits? Of course. I'm still very much an amateur programmer but still use BASIC - I've been having a lot of fun with PICAXE chips recently. The fact that they are available has probably been a factor in my not having learned C yet, so there is one thing you CAN blame BASIC for :-)
I have always been partial to BASIC when developing on a PC (or early PC like the PET). It has always worked well with a minimal learning curve. I made the transition to Visual Basic and then on into the .NET approach. Even though Visual Basic looks a lot like Visual C nowadays, I always will opt for Basic in the hope that the examples I follow will be less obscure than C (or C++) code. This is especially important to me because I only program in this environment every 2-3 years, and so by the time I come back the whole environment has changed and I have to re-learn everything.
I recently worked on a project for an Android tablet using the full Google environment, programming in Java. The whole experience was so disheartening that next time I'll be working with Basic 4 Android.
You would enjoy this interview I did with Werner Haussmann in 2006. Werner wrote all the VB code in his article. It has links to all of his VB-related articles as well. I should feature his articles in a newsletter.
I wrote an article in 1993 called "Out of the Dark With Visual Basic" where I borrowed a few data-acquisition boards and wrote some code. The article isn't online, but I have the print version and can scan the pages.
Your articles and those of Mr. Haussmann provided much motivation and inspiration for me. I have copies of some of them in the binder of my source material for the book. I used that inspiration in 3 of the chapters of my book on Excel for Electrical Engineers, where I described interfacing to a DVM, a signal generator, and a vernier caliper, all within Excel. There was a lot more VBA programming in the other chapters.
Thanks, I will get to it later. There is a book called "Visual Basic for Electronics Engineering Applications" by Vincent Himpe. The second edition is available as a free download. There is a newer edition as well, although it is probably easier to get through Elektor.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.