I didn't write a BASIC interpreter, but I got as close as you can to that. BASIC is the reason I joined HP's fabled Calculator Products Division in Colorado. In college, I learned Algol. Couldn't stand it. Could not stand punched cards either. I could see that computers were going nowhere with batch processing. Then the university got an HP 9830 desktop calculator. You programmed it in BASIC. It had a keyboard, display, mass storage (digital cassette tape), and a thermal page printer. It was 1974 and it was clear this was the future of computing. I joined the HP division that made that machine when I graduated in 1975.
I did work on the HP 9825's HPL interpreter. Think of HPL as BASIC with the vowels sucked out. Meanwhile, HP BASIC began to rapidly advance. Labeled GOTOs, subroutines, and subprograms. Long variable names. Multidimensional arrays and matrix math. Advanced I/O. Instrument control. Interrupts handled at a high level. Graphics (monochrome then color). That all happened between 1972 and 1978.
Dartmouth BASIC was indeed limited, but BASIC had a long and useful life.
While I also had a trip down the (dark?) alley called APL, BASIC was the ticket to the micro world, thanks to the good folks at Parallax. From there, to the microEngineering Labs PICBasic compiler, and then assembler and C (all on PICs), it was a nice gentle introduction to microcontrollers. Yes, I had programmed Z80s in assembler, but BASIC brought me back to the embedded world.
I've always loved Basic. I've always found it pretty easy to learn and intuitive to use. I used the early micro versions, the Sinclair Spectrum (I think it was Timex in the US?), and the PC versions up to QuickBasic, in which I wrote a terminal emulator that worked very well.
Did the early versions teach you bad habits? Yes. Can you teach yourself good habits? Of course. I'm still very much an amateur programmer but still use BASIC - I've been having a lot of fun with PICAXE chips recently. The fact that they are available has probably been a factor in my not having learned C yet, so there is one thing you CAN blame BASIC for :-)
I too learned BASIC on a Sinclair -- yes, it was the Timex Sinclair in the U.S. -- and later used it on the original IBM PC. I agree that BASIC made it easy to learn bad habits, but it also enabled many of us to get into programming without the agony of late nights typing out punch cards at the campus computing center. BASIC served its purpose way back when.
I'm an RF guy, but I started with FORTRAN and then dabbled in some C++ in college.
I guess RF was easy for me, with none of that black magic that non-RF people talk about, but I was still so bored with the old analog controls/interfaces on my RF projects that I went for some real abuse and started doing some micro programming in the late 1990s to make my RF projects more modern.
I migrated to Microchip PICs in the late 1990s and started using a compiler called PICBasic Pro, simply because it was cheap, and it is an excellent BASIC compiler with numerous resources available.
I have built everything with PICs, from full-blown RF transceivers with all of the fancy LCD displays to a PWM-controlled AM transceiver with PWM bias control on the RF stages. When that wasn't enough, I added a PIC-controlled VSWR protection circuit.
Next was a full-blown ham radio repeater controller that sets the frequency and even does antenna rotation control via DTMF tones, all in PICBasic Pro.
Not being a dedicated programmer, I didn't have to stay with C, and I found that the version of BASIC I'm using is incredibly capable and extremely functional.
I coded some of my projects in C, just to be fresh on my old programming, but still for my projects I always found myself going back to the PICBasic Pro compiler simply because it is reliable and easy to use with more than enough built in libraries for all of my RF needs.
Now this brings back a memory of a story about a colleague who needed to design RF bandpass filters, but all he knew was Basic (self-taught). The corporate computer was a PDP-something used mainly by the software coders.
This guy started running RF filter design equations in Basic, and brought that PDP-? system to its knees with all the number crunching. The software folks were upset; whenever he ran his programs, their response times went sky-high.
What did not help (or what DID help from his viewpoint) was he had discovered that a simple PRINT statement to output a single character in the middle of his routine would cause the time-sharing machine to devote much more time to his program than usual, at the expense of the software coders.
I have always been partial to BASIC when developing on a PC (or an early PC like the PET). It has always worked well with a minimal learning curve. I made the transition to Visual Basic and then on into the .NET approach. Even though Visual Basic looks a lot like Visual C nowadays, I will always opt for Basic in the hope that the examples I follow will be less obscure than C (or C++) code. This is especially important to me because I only program in this environment every 2-3 years, and so by the time I come back the whole environment has changed and I have to re-learn everything.
I recently worked on a project for an Android tablet using the full Google environment, programming in Java. The whole experience was so disheartening that next time I will be working with Basic 4 Android.
You would enjoy this interview I did with Werner Haussmann in 2006. Werner wrote all the VB code in his article. It has links to all of his VB-related articles as well. I should feature his articles in a newsletter.
I wrote an article in 1993 called "Out of the Dark with Visual Basic," where I borrowed a few data-acquisition boards and wrote some code. The article isn't online, but I have the print version and can scan the pages.
Your articles and those of Mr. Haussmann provided much motivation and inspiration for me. I have copies of some of them in the binder of my source material for the book. I used that inspiration in 3 of the chapters of my book on Excel for Electrical Engineers, where I described interfacing to a DVM, a signal generator, and a vernier caliper, all within Excel. There was a lot more VBA programming in the other chapters.
I wrote an article in 1993 called "Out of the Dark with Visual Basic," where I borrowed a few data-acquisition boards and wrote some code.
Thanks, I will get to it later. There is a book called "Visual Basic for Electronics Engineering Applications" by Vincent Himpe. The second edition is available as a free download. There is a newer edition, although it is probably easier to get through Elektor.
I had a pretty good experience writing Applesoft BASIC. It didn't encourage writing great code, though. There were no labels, just line numbers. If you ran out of line numbers in one spot, you had to GOTO a different section. (There was no real text editor and no cut-and-paste; one could only edit a line of code at a time, so renumbering code was too difficult.) There were a lot of GOTOs in general. I don't think you could pass variables to subroutines, so everything was global. Only the first two characters of variables were recognized. Comments were sparse because they ate up too much precious RAM. Anything requiring speed had to be written in assembly, POKEd in by the BASIC program, and called from there.
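A quick sketch of the two-character trap, if memory serves (the variable names here are just for illustration):

```basic
10 REM APPLESOFT ONLY LOOKS AT THE FIRST TWO CHARACTERS OF A NAME
20 TEMP1 = 100
30 TEMP2 = 3
40 PRINT TEMP1
50 REM PRINTS 3: TEMP1 AND TEMP2 ARE BOTH THE VARIABLE "TE"
60 GOSUB 100 : REM NO PARAMETERS -- THE SUBROUTINE SEES ONLY GLOBALS
70 END
100 PRINT TEMP2 : REM STILL THE SAME "TE"
110 RETURN
```

Bugs from silently shared names like this could take a long night to find.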
I used to work for a software publisher (Micro Lab) in the '80s, and as I recall we had two tools, names forgotten: one would take a file with symbolic labels and replace them with line numbers; the other was a line-number cruncher, which we typically ran our code through, that would pack as many commands as possible onto a line while renumbering with an increment of 1. We did the crunching to make the code harder for users to edit, and for performance.
Furthermore, I wrote a machine-code library to hook into the Ampersand hook to do disk I/O that was twice as fast as Apple's DOS command interpreter. The package was known as Language Plus; it was poorly marketed and overpriced to boot. The company insisted on selling it at $150, while I argued for $49.95.
We had other tools that I wrote that would print the listings one command per line. I liked drawing tiebars around If/Then and For/Next statements to analyze the code.
Modern Basic versions like VBA, VBScript, and Visual Basic make it possible to quickly and easily transport code between Word, Access, Excel, Windows batch files, and web server ASP scripts. Code modules I wrote in Word 97 in 1999 are still used every day in Word 2010 templates, Access, VB programs, and our Intranet site.
I remember learning BASIC on a Xerox Sigma 7 mainframe in high school. The connection was via TTY to the state university over 300 miles away. The Sigma 7 was also used for the Navy's A-4 Skyhawk simulators at the flight schools.
Never had the opportunity to do embedded Basic, but I did do embedded PL/M code once, before moving on to assembler and C and later C++.
There likely still is embedded Basic code written with that compiler flying today somewhere in the world.
I worked at Wang Labs while in school. My first exposure to BASIC was on the Wang 2200 in late '72 or early '73.
Like others here, I still use BASIC in VBA and Visual Basic. I find the old VB 6.0 a whole lot nicer than what it became after it got Microsoft'ed into its current form. Thankfully I still have some XP machines that keep it alive.
I've programmed in one dialect of BASIC or another for over 40 years and never had much trouble with the code. In fact, for parsing lines of text, it is still my go to (cough) language.
Certainly I've seen my share of spaghetti logic written by undisciplined programmers. At the software publishing firm I worked for, we had one egregious fragment, found in the code I was hired to clean up, taped to our wall as a warning to others. (See http://www.rostenbach.com/programming_horror.gif)
But you can write structured code in BASIC if you do it right.
The problem is those who just want to hack out something and don't care what they are doing, and you can find that in any language.
Actually, Microsoft's QuickBasic 4.5 and 7.1 were both also compiled. But they won't run on anything past WinXP except with a DOS VM, which slows them down.
But never fear, there are enough Basic fans that have kept the language alive. My favorite today is QB64, which is a compiled version of QuickBasic, fast, and runs just fine on Win7 and Win8, not to mention Linux and Apple OS, thank you very much. And QB64 runs just about any MS QuickBasic code with no changes, not even changes to the graphics, although it adds a host of new features.
It's easy to write proper code with these updated versions, just as it is in VB. And the nice thing is, most programmers, no matter what their preferred language, can read Basic code. Not all that different from Pascal, in this regard, only Basic still soldiers on, and Pascal?
David Ashton might be pleased to know that QB64 is hatched down under.
To answer the title, I'd say "bad rap." The trip has been fun.
I have been a fan of BASIC since I learned about computers way back in 1976.
In its interpreted form it is so compact that, in the later part of my career, I used it in our smart TV product (developed in 1990) as a simple embedded computer for children. It was fun working with BASIC then.
BASIC was great for its time - a simple language that could get you going on many different systems. The old Apple ][ had a BASIC interpreter - and a Pascal language card that languished on many machines it was installed in.
I used to compare BASIC and Pascal as 'do it your way' and 'do it my way' languages. I always chose the 'do it your way' - and still try to do that, even in C++ - but that's another story.
My favorite BASIC statement was actually 'COMEFROM.' Unfortunately this was not implemented in most dialects of BASIC. It would have been a much richer language if it had been.
I wrote a lot of code in BASIC, using Microsoft's semi-compiled QuickBASIC in the last few years I used the language. But that was after I was able to build my own machine and wasn't tied to the ball-and-chain of punch cards and job submittal statements surrounding the 'real' code.
Like many posters in this forum, my first language was FORTRAN, learned on a UNIVAC 1108 in the basement of the University of Wisconsin Computer Sciences Building - MACC as it was called then. We used Hollerith cards and I got pretty good at distinguishing an 'O' from a '0' - much better than the FORTRAN compiler we were using at the time, in fact.
The MACC basement was a large open area with two alcoves - one housed the punch card machines where you entered your world-beating (or at least course-passing) FORTRAN code on Hollerith cards; the other alcove came to house a collection of DEC-Writers that allowed you to initiate interactive sessions with the UNIVAC - and spend a lot of real money fiddling around.
The really serious area of the basement was the table(s) reserved for people doing graduate work in CS. You could tell them by their long, scraggly beards, huge collections of card decks in boxes and the overflowing ashtrays. Each time I went down there I swear those guys (there were no women graduate students that I recall - perhaps I was blinded by the clouds of smoke) hadn't moved since the last time I was down there.
I wonder if they were still there when they remodeled the CS building some years back? Maybe they are inside the glass house, preserved like Chairman Mao or (formerly) Lenin?
I once saw an article about how to self-document your FORTRAN code by using a "COME FROM" statement, which is really just a Comment (FORTRAN comments start with a "C" in column 6, so you could type "C" or a word beginning with "C" in column 6 to designate that line as a comment), like using the keyword "REM" or the word "REMARK" for a BASIC comment (in the days before an apostrophe became common for comments).
My favorite FORTRAN statement is REAL MEAN.
(For those unfamiliar with FORTRAN, variables beginning with I, J, K, L, M, or N default to Integer representation, which is why they are used for loop counters. All others default to Real (floating point, with a decimal point). For the variable MEAN to represent a Real Number it has to be specified as REAL.)
When it comes to FORTRAN, my favorite was the mismatched COMMON
In one module:
COMMON REAL MEAN(2o)
In another module:
COMMON MEAN(21)
Link these modules into your gigunda program and see what happens.
Compilers (and programmers) weren't too smart back then, so the first one defined an array of two REALs (the 'o' character was dropped in the first compiler pass, leaving a '2') and the second defined an array of 21 INTs. Any data defined after the first COMMON statement would get stepped on by the second. Much merriment and confusion all around.
Stargzer wrote: My favorite FORTRAN statement is REAL MEAN.
First, I feel compelled to point out that FORTRAN IV comments begin in column 1. Otherwise columns 1-5 are used for a numeric label for GOTOs and FORMATs. Column 6 is for "continuation", i.e., this card is a continuation of the previous statement.
Columns 73-80 are reserved for the card sequence number so that if you drop your deck you can sort it back into the proper order, assuming that (1) you actually put sequence numbers in the deck, and (2) you can figure out how to use the card sorter.
The compiler ignores 73-80 and a common error made by newbies is entering a line longer than 72 characters. I kept an IBM flowcharting template handy since it had a ruler with inches and tenths so you could quickly see if a newbie had characters past 7.2 inches.
My favorite FORTRAN statement is REAL BUMMER which a frustrated student once included in his FORTRAN program according to an operator.
EQUIVALENCE is a lot of fun. You can use it to get around FORTRAN IV's restriction that arrays are numbered from 1. I used it to number arrays from 0 when writing a Fast Fourier Transform, which is quite nice in FORTRAN since the language has built-in COMPLEX numbers.
@betajet with the stern-looking icon: "First, I feel compelled to point out that FORTRAN IV comments begin in column 1. Otherwise columns 1-5 are used for a numeric label for GOTOs and FORMATs. Column 6 is for "continuation", i.e., this card is a continuation of the previous statement."
I sit corrected (I don't stand for much in our office environment -- chairs are so much more comfortable!). Some brain cell connections fade with age, which is why I keep books that otherwise might only induce nostalgia. I should have reached over to my bookshelf for my copy of Programming the IBM 360 by Clarence B. Germain, the textbook from our PL/I course, which also covered some basic FORTRAN, COBOL, and Assembler, as well as some of the various peripherals for the 360. (There is a COBOL coding sheet still tucked away inside the book.)
"Columns 73-80 are reserved for card sequence number so that if you drop your deck you can sort it back into the proper order, assuming that (1) you actually put sequence number in the deck, and (2) you can figure out how to use the card sorter."
LOL! I usually used the shortcut, on smaller decks, of standing the deck up and drawing a diagonal line across the top. At least you could see an out-of-sequence card before you dropped it in the hopper.
"The compiler ignores 73-80 and a common error made by newbies is entering a line longer than 72 characters. I kept an IBM flowcharting template handy since it had a ruler with inches and tenths so you could quickly see if a newbie had characters past 7.2 inches."
Somewhere I still have the steel ruler with numbers every 0.1" to check spacing on printouts, as well as a later plastic version with a yellow highlight strip down the center to make one line stand out when measuring spacing.
"My favorite FORTRAN statement is REAL BUMMER which a frustrated student once included in his FORTRAN program according to an operator."
I like that one! A redundant statement but still all too true at times! It reminds me of the time in Piss Chem Lab when our Inversion of Sucrose experiment wasn't going well. When my lab partner came back from making the carefully measured sucrose solution for the 3rd or 4th time he had the volumetric flask labelled SUCKROSE!
After we broke the stem off the gas cell for the IR in a late-night make-up lab, I sometimes wonder if they graduated us just so they wouldn't have to see us again!
I spent a lot of time at MACC from 1974-1978. I was an ECE ugrad, but hung around MACC mostly because (unlike Engineering at the time) there were female students there. I think I was a bit after sixscrews' time because we transitioned to the UNIVAC 1110 fairly early on. We still had the FASTRAND movable-head magnetic drum memory, which The Devil's DP Dictionary describes as "a device for storing angular momentum".
Actually, I spent most of my time at the Computer Systems Lab, which was down the hall from the Big Room with all the tables. CSL had the PDP-11, which was "love at first byte". You could sign up for it one hour at a time. Students taking classes that used it had higher priority, but if you had flexible hours you could get enough PDP-11 time to have a lot of fun. Besides, most of your time was keypunching ASM programs so you didn't need that much time actually on the PDP-11 if you were a good programmer.
During one of the years I was there, the computer nerds who never had dates on Saturday night would go across the street to Union South and watch Space 1999 on a communal TV. Space 1999 was on a local station at 10:30 PM. If you've never seen it, it's a British science fiction series that took place on the Moon, which had been knocked out of orbit in 1999 due to a disaster on the far side. Space 1999 had excellent special effects, good actors including Martin Landau and Barbara Bain, and absolutely awful writing which took itself way too seriously. So we watched it for laughs, and there were plenty of those.
AH - the off-site computer systems labs at UW-Madison. ECE had a Harris installed in the basement of their 'domain' back in 1978-79. It was run by graduate students ('nuff said). It had a card reader and a number of 'interactive terminals' (a good come-on line to your date, if you were careful). That Harris was not the most reliable machine in the world and had a pocket watch hanging inside the main cabinet.
After a few nights working with this machine I learned the following:
If the card reader and 'interactive terminals' stopped working at once, you looked into the glass box at the grad students. If they were consulting the pocket watch and typing on the console keyboard, you could stick around for a reboot and perhaps get your job done before the crack of dawn. If, on the other hand, they opened the drawer in the desk and got out the soldering iron, it was time to go home and open a bottle of Golden Glow, as the night was done.
I'd thought that he'd written the Applesoft floating-point BASIC, too, but that actually came from Microsoft. The Wikipedia page says that Microsoft licensed it for 7 years for $21,000. I would feel sorry for them missing out on all that big Apple money in the '80s, but things turned out all right for Microsoft.
Freshman year of college, the fall of 1969, I managed to get into a FORTRAN course, followed by a PL/I course the next semester, both on the college's small IBM 360/25 mainframe. Somewhere in between I learned BASIC - Beginner's All-purpose Symbolic Instruction Code - on an ASR-33 dial-up link to the Dartmouth Time Sharing System (DTSS), the home planet for BASIC. I can't seem to chase it down, but I think the book put together by Kemeny and Kurtz was called "BASIC Programming." It contained small programs showing how the language could be used both in science classes and in the humanities. It was, after all, designed for beginners and for both science/math majors and for non-science majors.
One of the several jobs I held after graduation was teaching computer programming to 4th, 5th, and 6th graders at a private school. We had a TTY link to a local time-sharing system. They had a BASIC program that was a text version of a Star Trek game, so I listed it off and kept a copy of it (I swear, it's somewhere in the basement to this day, along with several boxes of IBM punch cards, including a digitized version of "Nude On A Stool" that used different print characters to shade the picture to a grey-scale representation. It's an object module ready to go as soon as I find a System 360 with DOS ... .).
Several years later I visited a plant in Florida (I think it was Racal or Racal-Milgo) that made an automated tech control system that could be based on a DEC PDP system (10 or 11?) or an 8080, depending on the size of the network you needed to control. The 8080 version had one alarm signal programmed to play the tune of the watch that would wake up the agent in the movie "Our Man Flint." It wasn't written in BASIC, but the plant's mainframe did have a BASIC compiler ...
Now, BASIC can be somewhat portable as long as you don't use PEEKs and POKEs or other special implementation-dependent instructions. I gave one of the engineers a listing of the Star Trek program and not too long after that the sales rep told me that a plant-wide directive went out that there would be no Star Trek during the day shift -- it was slowing down the big PDP system!
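As a sketch of the difference, here is the sort of thing that did and didn't travel between machines (the PEEK address is the Apple II keyboard register, quoted from my fallible memory):

```basic
10 REM PORTABLE: RUNS UNDER MOST CLASSIC BASICS UNCHANGED
20 FOR I = 1 TO 3
30 PRINT "WORKING..."
40 NEXT I
50 REM NOT PORTABLE: READS THE APPLE II KEYBOARD HARDWARE DIRECTLY
60 K = PEEK(-16384)
```

Lines 20-40 would run anywhere; line 60 would run, but do something meaningless, on any machine that wasn't an Apple II.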
I once read that those who learned on early languages such as FORTRAN, COBOL, BASIC, and PL/I had a hard time adjusting to Object Oriented languages. Sadly, I have found this to be true. :- (
Stargzer wrote: I once read that those who learned on early languages such as FORTRAN, COBOL, BASIC, and PL/I had a hard time adjusting to Object Oriented languages. Sadly, I have found this to be true. :- (
Personally, I suspect that those who didn't learn these languages also have a hard time adjusting to OOP languages. A FORTRAN programmer is used to telling a computer what to do and having it do it. The OOP challenge is figuring out what OOP hoops you must jump through to accomplish this. People who start with OOP languages don't have this challenge. Instead, they're faced with so many levels of abstraction that they haven't a clue as to what is going on.
Personally, I think Pascal is a great first programming language. You get structured control, procedures, clean data structures, and the discipline of declaring all variables before using them. Pascal was designed to teach good programming practices from the beginning.
(No, Pascal was not my first programming language. That honor goes to assembly language and FOCAL.)
@betajet: "Personally, I suspect that those who didn't learn these languages also have a hard time adjusting to OOP languages"
I eventually worked part way through the jungle. A few years ago I had to learn Tcl (Tool Control Language) to run reports against a Unix system. I finally got to the point where I could solve most of my errors on my own, and even found a solution online using a technique that even the Guru who taught me didn't know. Once I got good at it, they gave the task to a contractor and transferred me to another area.
"Personally, I think Pascal is a great first programming language. You get structured control, procedures, clean data structures, and the discipline of declaring all variables before using them. Pascal was designed to tech good programming practices from the beginning."
I learned the value of declaring variables and structures in PL/I. PL/I had both the virtues and the vices of FORTRAN and COBOL. It could be as easy to code, but as non-self-documenting, as FORTRAN, or as self-documenting as COBOL.
For example, as in FORTRAN, PL/I variable names starting with I, J, K, L, M, and N defaulted to Fixed Decimal (15); all others defaulted to Float Decimal (6) (@betajet: This time I did pull my old DOS/TOS PL/I D Language Reference and Programmer's Guide manuals from my bookshelf!). So for a quick and dirty program it was easy to use FORTRAN rules for names. In COBOL, I recall that variable declarations were mandatory; in PL/I they were optional, and one could not only declare variables and arrays but also structures, which could be used to read a record containing different types of data into a single structure. It was easy to be organized and self-documenting in PL/I if one so desired. ;- )
Maybe 30 years ago I was in a Pascal course for a system that was being delivered to us. I remember the instructor telling us that the first byte of a string set the length, but we had to use a function to change the length. I said, "Why can't I just stomp on the first character to change the length? It's more efficient!" "Because you wouldn't work for me if you did that!" was his reply. "Even if I document it?" "Yes!"
I never had a chance to complete the course, because after a couple of days they hauled me out of the course to work on a datacomm hardware problem for the new system.
A short while later I asked the instructor why the sliding alpha test for the line printer was so slow -- it couldn't even drive a 600 LPM printer at 300 LPM. He said the programmer was regenerating the line on each pass. I asked him why he didn't just create the line, glue a copy of itself to the end, and step across it with a substring loop. He replied that it was written by a trainee. My first thought was, why are we paying to train your programmers?
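The doubled-line trick looks roughly like this in any old BASIC (a toy version: the real test would use the printer's full line width and character set, not just the alphabet):

```basic
10 REM SLIDING ALPHA TEST: BUILD THE LINE ONCE, DOUBLE IT,
20 REM THEN TAKE A SLIDING SUBSTRING INSTEAD OF REBUILDING EACH PASS
30 A$ = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
40 B$ = A$ + A$
50 FOR I = 1 TO LEN(A$)
60 PRINT MID$(B$, I, LEN(A$))
70 NEXT I
```

Each pass is one substring extraction instead of 26 concatenations, which is why the trainee's version couldn't keep the printer fed.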