What’s wrong with Basic?


I have yet to see a computer language that offers anything not available in Basic, in so far as the language itself goes. But Basic has been maligned as the ruination of computer coding. There is an alternative opinion out there.



73 thoughts on “What’s wrong with Basic?”

  1. Pffft, Basic?

    I have yet to see a computer language that offers anything not available in Assembly, in so far as the language itself goes.

  2. I’d suggest checking out C, Perl, Python, and Lisp for some features not available in Basic.

  3. “I have yet to see a computer language that offers anything not available in Basic”

    Pretty much any language in wide use today has features the old Basic didn’t have. To mention a few: variety of control structures, dynamic memory allocation, OOP, scope, higher-order functions, closures, first-class functions, continuations, pointers/references, data structures, macros, modular programming, multimethods, non-local exits.

    There are much better languages to choose from today, so why not use those instead? If you want something as dumb as Basic, you might as well just learn an assembly language. It offers all the benefits listed by the author of the linked article, but has the advantage of being useful in certain kinds of work.
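
    A minimal Python sketch of two of the features from the list above — closures and higher-order/first-class functions; the names here are invented for illustration:

        def make_counter():
            count = 0                    # state captured by the closure
            def increment():
                nonlocal count           # closes over the enclosing scope
                count += 1
                return count
            return increment             # functions are first-class values

        def apply_twice(f, x):           # higher-order: takes a function
            return f(f(x))

        counter = make_counter()
        print(counter(), counter())              # 1 2
        print(apply_twice(lambda n: n * 3, 7))   # 63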

  4. Nothing. Nothing is wrong with BASIC. It will do the job you want. Also, as a learning language, it’s a lot more intuitive than assembly, even though the practices are essentially the same.

    Like the author said, it’s good to need to keep in mind what all of your variables are doing. It’s good practice for multithreaded programming, in fact. Also, IMHO, it’s good practice for programmers to implement a recursive algorithm without the crutch of functions.

    Would I use it? No, because I like functions and local variables. They’re too convenient for me to set aside. So I stick with C, C++, Java, etc.

  5. I have yet to see a computer language that offers anything not available in the lambda calculus. Three constructs: variables, functions, function application. You don’t need anything else.

    For a good first language, I’d go with Scheme, or maybe Python. Languages like C or C++ are terrible first languages since the compiler will just let you make stupid mistakes (esp. memory and type errors) and your only indication of a problem will be corrupted data, or if you’re lucky, a seg fault.
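
    A minimal Python sketch of the lambda-calculus claim above: Church-encoded booleans built from nothing but single-argument functions. The names are the conventional ones, not from any library:

        TRUE  = lambda a: lambda b: a        # selects its first argument
        FALSE = lambda a: lambda b: b        # selects its second argument
        NOT   = lambda p: p(FALSE)(TRUE)
        AND   = lambda p: lambda q: p(q)(FALSE)

        to_bool = lambda p: p(True)(False)       # only to display the result
        print(to_bool(AND(TRUE)(NOT(FALSE))))    # True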

  6. BASIC is terrible, but it still works as a programming language. I think a powerful argument can be made for not disrespecting those that used BASIC in the past, but to recommend BASIC today is ludicrous.

    As a kid with a low budget and non computing family, various versions of BASIC were all I had access to. C compilers were very expensive, as were instruction manuals, but BASIC books were easily found and cheaply acquired. Assembly was available on my old Commodore PET alongside its built-in BASIC interpreter, but I had no clue where I could start figuring it out.

    Back in high school, I wrote a Population Genetics program full of tools and examples completely in QBASIC. It was a 91Kb nightmare of balancing variable and size constraints, continuously reaching artificially imposed limits and wishing I had capabilities which are now standard in most languages.

    Today, I wouldn’t wish BASIC upon my enemies. I had to unlearn so many quirks of BASIC when moving to more useful languages. Use something easier, more practical, and fully featured instead, such as Python or Java, both of which are excellently documented and have good built-in debugging feedback.

    In short, using BASIC may have been okay 10, 20, 30 years ago when there was less choice, and lessons then learned are not wasted, but there is no excuse for teaching or even using BASIC today.

  7. You might want to check out REALBasic. It’s freaking awesome. Highly OO, I don’t know what “reflection” means in this context, and pre-emptive multithreading is, for a lot of people, unnecessary. Even then you can spawn helper apps to approximate such behavior.

  8. The problem with BASIC as a very first language is not that it offers too little, it’s that it offers too much—too much structural flexibility. As a result, it encourages spaghetti code.

    That’s not to say that a good programmer can’t make use of some anarchy to good effect, only that it should be something used with deliberation.

    My own experience was to be introduced to programming through Scheme, with only recursion, local immutable variables, and if-then-else branch statements as tools. It seemed much easier for me to pick up loops, state, objects, etc, than it was for other students to give them up.

    Although, I’d be happy if someone could figure out how to prevent people from writing FOR-CASE structured code. WTF is up with that?
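
    For readers who haven’t met it, the FOR-CASE antipattern mentioned above looks roughly like this Python sketch (the three step functions are hypothetical stand-ins):

        def load_input():      return [3, 1, 2]     # hypothetical stand-in
        def transform(data):   return sorted(data)  # hypothetical stand-in
        def save_output(data): print(data)          # hypothetical stand-in

        # A loop whose body dispatches on the loop index, performing a
        # different fixed step on each iteration...
        for step in range(1, 4):
            if step == 1:
                data = load_input()
            elif step == 2:
                data = transform(data)
            elif step == 3:
                save_output(data)

        # ...which is just an obfuscated way of writing straight-line code:
        save_output(transform(load_input()))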

  9. Real-time performance.

    And, I do mean *real-time* not just “fast enough if it’s ok to drop a few frames of data now and then”.

    Real-time as in “if the anti-skid brakes let one wheel lock up, Greg loses steering and he, the missus, and the offspring, slide into oncoming traffic” real-time.

    Greg, you really should stick to what you know, which does not include computers.

  10. All the above are Turing-Complete languages, including BASIC. This means that anything that can be written in any one of them can be implemented in all of them. The differences come by way of aesthetics in program structure and runtime efficiency. Which also translates into programmer efficiency. So Greg’s comment stands.

    BUT

    Reflection and introspection specifically: when not present as part of the language (e.g. C, C++, BASIC, Pascal, …) you can only get them by building your own structures and paradigms and interpreting them with the base language. T-C still holds, of course, but in these cases it holds because any language can be used to write an interpreter for any language (including Turing machines themselves; from there on it’s turtles all the way down … find a copy of vturing.exe to play with if this looks like fun), so you can choose what primitives you need that way.

    So what language to use, traditionally an almost religious issue, really just comes down to personal taste and problem resolution description match. Just like what editor to use to write it with. On what OS. Using what compiler / debugger.
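
    As a concrete illustration of reflection/introspection being part of the language: in Python the running program can inspect its own types and look up members by name, with no self-written interpreter required. A minimal sketch with an invented class:

        class Point:
            def __init__(self, x, y):
                self.x, self.y = x, y
            def norm(self):
                return (self.x ** 2 + self.y ** 2) ** 0.5

        p = Point(3, 4)
        print(type(p).__name__)                              # Point
        print([m for m in dir(p) if not m.startswith('_')])  # ['norm', 'x', 'y']
        print(getattr(p, 'norm')())                          # 5.0, looked up by name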

  11. “There are OO structures, functions, and scoped variables in Visual Basic. And besides, those are all just crutches.”

    As the linked article stated, we aren’t talking about VB here, but the Basic that Dijkstra was complaining about. The language features that Basic lacks are the features that make the difference between 1000 lines of code and 5 lines of code, and the difference between 10,000 hours of maintenance and 1 hour of maintenance.

  12. IMO BASIC has a place in programming. If you want to quickly write a 20 line program to do some simple task, BASIC is great. Fast to write, easy to debug, unfussy about syntax and structure.

    Anyone who’s spent an entire day picking apart C++ code to find the obscure bug – in a program which you may only need to run once for ten seconds – knows the value of languages that allow quick-n-dirty coding.

  13. Why is it that discussion of programming languages always degenerates to the best first language to learn? Granted, the linked article is about the first language but I’m making a more general observation about any programming language discussions.

    As a general point, there is no one best language. It’s folly to think and/or pretend there is. I wish more people would move away from this thought to the realization that some languages are better for some things than others.

    C is good for low-level stuff that’s just a margin above assembly and allows a large degree of fine control, which is why a lot of other languages are implemented in C.

    Python is a good general language that I often use for anything from a step above shell scripting to a self-standing web server. But its lack of compile-time checking can cause problems in large projects where it can take minutes or hours to get to changed code.

    PHP was specifically written for use with HTTP but it lacks consistency as a language.

    .NET is a nice way to integrate different language paradigms into one program (I have, for example, written a string parser in F# that is used by C#).

    Squeak/Newsqueak/Go are exceptional in places that can use channels.

    [HTML] Javascript is a lightweight language that is good for small stuff but, until recently, suffered from bad performance. It lacks a sufficiently good base library, which necessitates jQuery and all the other “libraries”.

    I could go on.

    As far as language features go:

    Coroutines are neat and I wish more languages had them (iterators are often a weak form of coroutines). I don’t know how you could fake them if the language doesn’t support them.

    Multiple dispatch kind of goes with coroutines, but not quite. Cannot fake this if you don’t got it.

    Function overloads. Cannot fake this if you don’t got it.

    Functions with variable arguments. C’s support for this is pretty horrendous and, of course, not remotely type-safe. Cannot fake this if you don’t got it.

    Custom iterator support. I am amazed at how much of what I code involves iteration. The harder iteration is, the less concise the code.

    Integrated list support makes things easier in general. Python has this built in. (Ditto for dictionaries.)

    List comprehensions kind of add to having integrated list support.

    OOP. It’s almost as much of a blessing as it is a bane for the simple reason that if you don’t know good practices for OOP then you can make an absolute mess of organization. Anyway, OOP support can be faked but it’s not as pretty.

    Generics, which are similar but not the same as C++ templates. You absolutely cannot fake this if the language does not support it.

    Lambdas and anonymous methods. Another you cannot fake. The half-support of lambdas is one of my annoyances about Python.

    Reflection is a nice feature but doesn’t depend on the language except for being able to use the types and methods directly.

    First-class methods go a bit with reflection. Something I dearly wish C had.

    Pass-by-reference. Cannot fake it if your language doesn’t have the concept or the support of references.

    There’s more but I’m drawing a blank at the moment.

    I know there are some modern-ish Basic languages (like VB.NET) that have a number of these features, but I don’t know of *any* language that has them all. (A few of the features above are sketched below.)
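
    A minimal Python sketch of a few items from the list above — a custom iterator (via a generator, itself a weak coroutine), variadic arguments, a list comprehension, and a lambda as a first-class value; all names invented:

        def countdown(n):
            # a custom iterator, written as a generator
            while n > 0:
                yield n
                n -= 1

        def total(*args):
            # variadic arguments, unlike C's untyped varargs
            return sum(args)

        squares = [x * x for x in countdown(5)]       # list comprehension
        print(squares)                                # [25, 16, 9, 4, 1]
        print(total(1, 2, 3))                         # 6
        print(min(squares, key=lambda v: abs(v - 10)))  # 9; lambda as a value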

  14. LightningRose: Troll warning. I’m quite confident that I know 10 times more than you do about computers. And I’m not going to tell you if that 10 is binary, decimal, or hex.

    Chris: Fair enough, but I was talking about Basic; yet, I admit, my concept of what I was talking about did shift as I wrote the post and since.

    Personally, I use programming for limited purposes and I have limited needs, and I did cut my teeth on basic. All the complicated stuff I have done has been in Basic, dBase/Clipper or Fortran. Those are languages designed for someone to quickly manage data, numbers, statistics, etc.

    Modern programming languages are for programmers, and they can be pretty amazing. But there is almost nothing about any of these programming languages that has ever been anything but a hindrance to me, and this is true for a large class of people who are not programmers, but who do program.

    Honestly. If I want to use each of several of the numbers in a filtered series of data to calculate another stream of data to do something with, I would strongly prefer a basic algebraic statement. Basic gives me that; nothing else gives me more. I can write the exact same code (give or take some esoteric punctuation) that would run in any of the languages mentioned above. Given that, I would prefer the language that gives me the results with as little beyond that code as possible.

    But not perl.

    Also, how do I know if any of you guys know what you are talking about? Are you up for a computer programming challenge? I’ve got a good one for you. I’m not sure if you can handle it, though. Let me know if you want it, I’ll post it. I’ve solved it in Basic. I’m thinking of writing an assembly language program to do it.

  15. I’d love to see the challenge.

    Admittedly, I don’t know what I’m talking about, I just have some opinions about what I like (someone did see fit to pay me a salary to write software, which I suppose is something. But then I saw some of the things that people have been paid to program, and decided that wasn’t too much of an endorsement).

  16. I have yet to see a computer language that offers anything not available in Basic, in so far as the language itself goes.

    Then you haven’t seen much. Try writing a lambda function in BASIC. Or a simple functional program. It can’t be done without writing an implementation of a functional language in BASIC first. Or create an inheritance-based object structure to simplify an organizational problem. Or do a quick and dirty one-line program to replace all “Gregg” with “Greg” in a directory full of documents.

    Even when BASIC made its first appearance it was a highly limited language, omitting a lot of very powerful and useful paradigms for the sake of being possible to implement on incredibly weak hardware. Today many of those paradigms are being used by non-programmers on a routine basis, for an immense gain in both productivity and in how sophisticated problems can be solved with comparable effort.

    If all you’re doing is applying maths to a chunk of numbers, then of course no other language offers anything substantial over BASIC (although if you ever tried Numerical Python you might change your views on that as well), but that’s something you might want to do in an Excel macro instead anyway. As soon as you have to solve involved problems, the limitations of BASIC very quickly get in your way instead of being a relief.
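
    The “replace every ‘Gregg’ with ‘Greg’ in a directory” task from the comment above really is a few lines in a scripting language. A Python sketch, assuming a hypothetical folder docs/ of plain-text files:

        from pathlib import Path

        for f in Path("docs").glob("*.txt"):
            f.write_text(f.read_text().replace("Gregg", "Greg"))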

  17. Paul Graham, a proponent of Lisp, wrote an essay about how programmers won’t miss features they don’t know. He describes this as the »Blub Paradox«, taken from http://c2.com/cgi-bin/wiki?BlubParadox

    As long as our hypothetical Blub programmer is looking down the power continuum, he knows he’s looking down. Languages less powerful than Blub are obviously less powerful, because they’re missing some feature he’s used to. But when our hypothetical Blub programmer looks in the other direction, up the power continuum, he doesn’t realize he’s looking up. What he sees are merely weird languages. He probably considers them about equivalent in power to Blub, but with all this other hairy stuff thrown in as well. Blub is good enough for him, because he thinks in Blub.

    Case in point: many programmers first look at string pattern matching with regular expressions as undecipherable noise. But having written many string parsers in many languages, I’ve been using regexps for 5 years instead. Now I think of all that wasted time before I »grepped« regexes.

    Same with first-class functions, lambdas, Lisp macros, and the concept of channels and goroutines in Google’s new »Go« language.

    Anyway, thanks for the opportunity to promote the mother of all Wikis, the Portland Pattern Repository (http://c2.com/cgi/wiki).
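
    To make the regex point above concrete, here is a Python sketch in which one pattern replaces what would otherwise be a hand-written parser (the log line is invented):

        import re

        log = "2009-12-28 ERROR disk full on /dev/sda1"
        m = re.match(r"(\d{4})-(\d{2})-(\d{2}) (\w+) (.*)", log)
        if m:
            year, month, day, level, message = m.groups()
            print(level, message)   # ERROR disk full on /dev/sda1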

  18. There are OO structures, functions, and scoped variables in Visual Basic. And besides, those are all just crutches.

    No. Sorry, but NO. OO, functions, scoped variables are not crutches – they are part of programming paradigms, and if you feel that they are unnecessary, I’d hate to work with any code that you’ve written (working with in the sense of having to re-write or debug, not run).

    Programming paradigms might seem like fads, but they serve a very important purpose – allowing the programmer to redefine the problem into something which can be programmed in an easy, clear way (or at least easier, clearer way). This is why different programming paradigms should be used to solve different problems.

    Going back to programming languages – most of us who write code for a living don’t just use a programming language, we use an entire framework. Currently the two biggest, and in my opinion overall best, are Java and .NET, but for domain-specific problems, other frameworks (or sub-frameworks) are sometimes better.

  19. For me the chief problem with BASIC is that there is no standard, and that means no cross-platform compilation and a whole host of other goodies. Another problem was that it seems to have been assumed that only beginners would ever use it – but to be fair that was overcome by providing a means of invoking assembly routines – which was great if you had an assembler because then you can patch through to C libraries. Microsoft’s Visual Basic has stabilized quite a bit and can probably be used as a programming language with reasonable confidence your code will be usable in the future – but that limits you exclusively to MS Winduhs. Folks like me use C most frequently and typically write code which can be used on numerous systems (tweaking the code for MSWin is usually one of the more annoying tasks).

    @DrMcCoy: I still write many thousands of lines of assembler code each year for various processors. It ain’t dead yet – despite the fantastic C compilers which are (sometimes) available, when you really need to push that hardware you sometimes get forced back to assembly. Unfortunately it’s no joke that it can take a month to write about 2 days of the equivalent C code. I’m currently working on a program that’s got 3k lines of code and will likely end up with 16k lines – great fun – especially the debugging. The job does have an air of nostalgia though since it involves the Freescale MC9S08 series (formerly from Motorola). These are mutants of the 6502, so it’s a bit like programming an ancient Apple – except that the tools literally run thousands of times faster and the computer itself runs at 48 MHz rather than 1 or 2 MHz. It also has a gazillion peripherals built in and consumes a mere 90 milliwatts of power. I can have the equivalent (well, better) of an Apple2e running on a wristwatch battery for days! (or weeks, depending on the battery)

  20. @LightningRose: My granddad did better – when he worked on the design of the anti-skid system for the 747 it was a purely analog system. There was no problem with real-time performance there. Besides, if you think any anti-skid system requires real-time performance (in the various senses in which it is used by ‘puter geeks), it is you and not Greg who don’t know what you’re talking about. Well, unless you clock the embedded computer at, oooh, maybe 32KHz.

  21. @kapitano: I use C almost exclusively for all my quick and dirty programming, especially since the ‘quick and dirty’ bits often end up as part of one monstrous blob of scripts and other programs doing something useful.

    Programming languages are mere tools; it doesn’t make sense to say there’s something inherently wrong with any particular language. C was pooh-poohed early on (especially by PASCAL fanatics) but low level hardware control is much easier with C because of fewer restrictions. Many things C could do, PASCAL simply couldn’t do without non-standard extensions. I saw some oft-touted virtues of PASCAL as being a rather severe limitation to me, so while some people would say “who wouldn’t want X?” I’d be going “who the hell would want X?” – and yet all those inventions do have some uses.

  22. Oh, *THAT* BASIC! I really don’t see the big deal at all. Variables limited to 2 significant characters? Globals everywhere? Pffff. FORTRAN folks, FORTRAN! And in the days, we didn’t have much computer memory and storage was incredibly expensive (or else tedious and a nuisance if you used the punch cards or paper tape). In those days, code in any extant language tended to have mysterious 1-4 letter sequences. You need only look at FORTRAN code from that era (1960s – late 1970s) and you’ll see the same – it’s easy because there’s a hell of a lot of FORTRAN code from that era which is still being compiled and run on computers all over the world today.

    As for other things like variable scope – those are disciplines implemented by the language rules and the compiler. When you get down to the raw machine code there are no such things. As already pointed out though, those things can be very useful indeed and really speed up the programming job.

  23. It seems that my usage of programming is not too different from Greg’s, in that it is merely one tool at our disposal, and one not always applicable, but is something we might dabble in. I’ve been writing bits of code off and on since I was a small child, but it was never a major aspect of what I was doing at the time. Plus, not having access to a computer from the early 80s until the late 90s meant that I was perpetually out of any loop. Sure I could hack some code out at a friend’s place that happened to have a 386, but that was never enough to really call me a programmer. When I hit college, I took a class in programming that used PASCAL…actually Think PASCAL, if I recall correctly. And I distinctly recall thinking, several times, that whatever we were instructed to work on could be done with a third of the code in Basic. I realized of course that that was entry level stuff and that quite likely some monstrous code would be even more monstrous in Basic, but it seemed absurd to me to “bring an elephant gun to hunt frogs”. As Greg mentioned (or was it someone else?), it’s a matter of bringing the right tool to the job, paraphrased. Ideally, any programmer would be familiar with all languages and be able to suss out which approach would best suit a particular task. Also ideally, all such languages would be able to work with each other and in any environment.
    But I will readily admit that I’m still an amateur, if a long-time amateur, and might perhaps possess an overly simplistic and/or naive view.
    Essentially my two cents, boiled down to a frothing mass of molten copper and other sundries through copious use of thermite, is that no language is perfect (ask Gödel) and that no language is inherently better or worse than any other until put into the context of a given task.
    To summarize the summary, Language X is just fine for what Language X is just fine for.
    end

  24. BTW, on your “Linux is teh shitz” campaign: ATA TRIM support, necessary for efficient use of SSDs, and which has been available under Windows 7 for a while now, will finally be incorporated into Linux with kernel 2.6.33.

  25. I have to say, I wonder how the blogger would react to someone stating “I have yet to see a tool in anthropological biology that offers anything not available in comparative anatomy, in so far as the tool itself goes”. Hopefully he would conclude that the one writing something that ignorant was not qualified to comment.

    And that’s exactly what the comment “I have yet to see a computer language that offers anything not available in Basic, in so far as the language itself goes” sounds like. It’s not even wrong, it’s that ignorant. There is probably not even a single class of Computer Science behind that astonishingly naive observation.

  26. I think most of the comments above lose sight of the fact that basic, in any form, is turing complete and therefore as capable as any other language. All the comments speak to its expressiveness and ease of use in specific domains, but greg’s comment still stands: it can in fact do anything you need it to do provided sufficient time and attention.

  27. Troll warning. I’m quite confident that I know 10 times more than you do about computers. And I’m not going to tell you if that 10 is binary, decimal, or hex.

    You asked a question, and I gave you a response which you apparently can only respond to by challenging moi to a dick waving contest.

    Very well, let the waving commence.

    I wrote my first (BASIC) computer program in 1975.

    I wrote my first assembly language program circa 1977. I’ve since written assembly code for PDP-11, VAX, 6502, 68xxx, x86, PPC, MIPS, and ARM processors.

    My first professional programming gig was writing code for real-time aircraft flight data acquisition and ground display systems for NASA in FORTRAN and assembly.

    I wrote my first embedded real-time multi-threading kernel in 1985. It had a clock interrupt that *had* to be serviced every 256 microseconds on a CPU clocked at 6 MHz. The entire application, including the kernel, fit in 8K bytes of RAM.

    Through the late 80’s and the 90’s I was the systems architect for real-time telecom products you very likely have used.

    In the semiconductor manufacturing industry I wrote code that was instrumental in making the first GHz microprocessors. Much of the code I wrote at that time is still in use today and was used to manufacture many of the components in the computer you’re using to read this.

    I first used Linux in 1996, and am currently involved in two open source projects.

    A few years ago, as a contractor, I warned a company that their real-time storage subsystem would fail under load if they insisted on using Linux and C++. They failed to heed my warning, and after wasting millions in development costs the project was canceled and over two dozen people lost their jobs.

    I’m really curious as to how a PhD in Archaeology and Biological Anthropology makes you an expert in computers.

    One last thing, I’m a woman so I guess you’ve won that dick waving contest after all.

  28. I think rpsms above loses sight of the fact that Greg Laden said “in so far as the language itself goes”, not “in so far as what functionality from other languages can be coded in BASIC with enough budget”.

  29. Swede: “I have yet to see a tool in anthropological biology that offers anything not available in comparative anatomy, in so far as the tool itself goes.” I would agree with that statement.

    Virgil: Speed does not count. If they wanted a fast Basic interpreter or a fast Basic compiler they would invent it! There is nothing inherent to Basic’s coding structure that has anything to do with speed. Or, for the most part, any of the other items that have been mentioned above. Almost everything mentioned above arose initially as an add-on or mod of an existing programming language that did not have it, and/or at some point in time has been added to an existing programming language.

  30. LightningRose: I wrote my first computer program in 1964 at the age of six, like Mozart. I have not mentioned dicks, that was you.

    My objection was not to what you know or don’t know, it was to you showing up out of nowhere and passing judgment on me. For that, I slap you down. Blogger’s privilege.

    To be fair (and I’m not promising fair) that did cause me to ignore your point about real time.

    You are wrong. For one thing, the one time I needed real time with basic … data collection with input devices attached to a laptop and instantaneous graphing of the results with various options (virtual buttons and dials) to change conditions … I wrote the software and it worked just fine. That was a low demand situation.

    There is nothing inherent in Basic that prohibits the development of a basic interpreter/compiler to run high volume real time operations. If there is, tell me what that is. Don’t tell me about dicks.

  31. There is nothing inherent to basic coding structure that has anything to do with speed.

    Jesus F. Christ. Did you ever so much as OPEN a book on Computer Science? It’s actually painful to read your arguments.

    Your background in Computer Science is not even remotely adequate to take the kind of stance you take here and defend it. You’re so ignorant of basic concepts I can’t even figure out where to begin to try to educate you.

    Dunning-Kruger in full effect, it seems like.

  32. Swede, do you have anything substantive to add? I may well be wrong, but not because what I have said enrages you. That would just be you being enraged. We are not impressed.

    To be honest, I actually expected someone to say something about how Basic simply can’t be Python or C++ or whatever. But no one has. No one has done anything other than to either point out things that other languages have that Basic does not have simply because of what developers chose to do, not because of the inherent features of the language, or to scream and yell at me because I’ve stepped on their territory.

    Come on people.

    Let me suggest an alternative way of thinking about this: What would happen if you wrote a Basic interpreter that was nothing other than a translator from Basic to Python? So the coding is Basic as pseudocode, and the output is Python. Would the universe unravel? Would there need to be things added that could never happen, that would never work?

  33. I started out with substance. You ignored it. You’re dismissing a whole field of science with a wave of a hand and demand, on a silver platter, to know what’s wrong with your immensely naive view.

    But just because you asked so nicely; show me how to write a lambda function in Basic. That’s a nice start. You can in Python, I assume, since you’re more knowledgeable in this field than Computer Scientists who make a living creating more efficient structures and algorithms, so it should be a piece of cake for you to show me.

    Oh, and your arrogance after ignoring arguments you evidently do not even understand is noted.

  34. Visual Basic has lambda functions. They’ve been there for about two years. This demonstrates several things:

    1) If there is some thing you don’t see you can add it.

    2) You are a doof.

    3) Some other things but I think I need another cup of coffee so I’m not going to bother.

  35. Oh, and translating BASIC to Python would be trivial. No problem at all. Translating Python to BASIC would be theoretically possible, given a truly vast development budget and a truly insane amount of time for development and debugging, but the BASIC code output by such a translator would not be human readable. It would be so complex and complicated there would be no way to unravel the original Python code by hand, even for trivial and easy to read Python programs which make use of OO or functional methods.

    Paradigm shifts are a lot more powerful than you understand. Not understanding OO and calling it a “crutch” is akin to trying to do modern biology with Lamarckism instead of genetics. It’s a basic lack of comprehension of the tools available and their role in software development.
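
    To illustrate why the BASIC-to-Python direction is the easy one: each line-numbered imperative statement maps onto one Python statement. A toy sketch that handles only LET and PRINT (real BASIC control flow such as GOTO would take more machinery, and the reverse direction is the hard part, as the comment above says):

        basic = [
            '10 LET A = 2',
            '20 LET B = A * 21',
            '30 PRINT B',
        ]

        for line in basic:
            _, stmt = line.split(' ', 1)   # drop the line number
            if stmt.startswith('LET '):
                exec(stmt[4:])             # LET A = 2  ->  A = 2
            elif stmt.startswith('PRINT '):
                print(eval(stmt[6:]))      # PRINT B    ->  print(B), prints 42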

  36. Visual Basic .NET has pretty much nothing in common with historical BASIC except the name, and the use of a few keywords. This illustrates one thing.

    1) If a language is broken, reinvent it and keep the name.

    That’s what Microsoft did, and it’s hurting a whole lot of people (including me) since you can’t run old BASIC programs in VB.NET.

    And despite the changes, you still can’t write functional programs in VB.NET. Or in C#, for that matter. Heck, even in F#, which is designed to be functional, it’s a pain to try to make it work. Which illustrates another thing:

    2) Just because you slap a band-aid on to something to add a specific functionality doesn’t make a language capable of handling a new paradigm – in this case functional programming, which is the whole point of using lambda functions in the first place.

    Paradigm shifts aren’t about slapping on a few new functions as “crutches”. It’s about learning a new way to think about programming. If you don’t need to do that, that’s fine, but you have no business trying to educate those who do that it can’t be done.

  37. Swede: You seem to have redefined “basic” to be whatever it needs to be to support your argument. You patched the argument. Your argument is hobbling along on crutches.

    I do agree with you, though. There are important and fundamental innovations across computer languages and that is a good thing. And some of those changes are unlikely to work as well in Basic as they do in the newer languages (though I’m not certain of that). That has nothing to do with my premise, however.

  38. I can dig a trench with a hammer, but should I?

    That said I often write hardware control user interfaces in VB because it’s pretty easy. It’s not so easy in, say, PERL, which is great for many other things instead. Tool/Job type mismatch kind of thing.

  39. Heh, *I* am the one redefining BASIC. You’re a hoot.

    You don’t understand what I say well enough to agree or disagree. And you won’t understand that you don’t with the attitude you have. I bet you hadn’t even heard of functional programming or lambda functions until today, and I bet you still consider both local scope and OO to be “crutches”.

    You’ve learned nothing. I waste my time. It can be used a lot better.

    *goes off to watch porn*

  40. Swede: Bugger off. For the record, variable scoping has been part of Basic for quite some time, it is not a crutch, and I’m very familiar with functional languages and lambda functions.

    I’m glad you learned some things about Basic today. But again, do bugger off.

  41. Greg, I am sorry, but you’re in the wrong on this – Visual Basic and BASIC are not the same. Dijkstra didn’t talk about Visual Basic when he wrote his piece in 1975. Not surprising, given the fact that VB 1.0 came out in 1991.

    Yes, VB is derived from BASIC, but it is certainly not the same thing.

    VB.Net is something else entirely.

  42. Kristjan Wager: Visual Basic is Basic. It is nonsensical to say that only a certain pre-1980 version of a language that has subsequently been updated can be used in considering the potentials and limitations of that language.

    HLA is assembly, C++ is C.

    Yes, the comments made by Dijkstra are relevant to an earlier form of Basic, and it may well be true that programming in line-numbered “GOTO” Basic is bad training. But that is not what I’m suggesting at all. What I’m suggesting is that Basic could still be a viable and more widely used language than it currently is … and, BTW, I’d say that for all intents and purposes about non-VB, but rather, regular (but updated) Basic.

    And, of course, one can write in old-fashioned Basic and run it in VB after going through all those extra stupid-ass RAD objecty steps.

  43. I take back what I said about no reason to use BASIC. Clearly, if it is what you understand and you can get it to work for you, by all means use it. I don’t really know much about VB, since learning the new system would take effort that wasn’t worth it at the time, but for many purposes Python really is the new BASIC, and as such is supplanting BASIC as the language many people use for small things and for beginners to cut their teeth with. It’s probably not worth your time to learn, but for anyone just starting to code or with much coding ahead of them, Python really does offer advantages over and above (pre-VB) BASIC, and I hear that Ruby is even better. Take a look at some code: http://code.activestate.com/recipes/langs/python/

  44. The definition of BASIC is so unclear, that it makes the challenge undefined.

    Now, the last Basic program I wrote was on an Apple II in 1980. After that, I switched to Pascal.

    So does a more modern “BASIC” support closures, or continuations?

  45. Of course Basic supports continuations and always has. Closure is presumably a feature of VB and other versions of Basic with functions and scoping and other fancy stuff.

    I agree that Python may be the new Basic, which is a bit troubling considering that Python is less basic than BASIC.

  46. In Soviet Ladensblogistan, Greg trolls YOU!

    Seriously Greg, is this the best use of your limited arrogant-as-you-are-ignorant time? Cite me the ANSI/ISO standard number for this BASIC language you keep nattering on about and then maybe we can talk; without a clear definition of what specific language you’re talking about, discussion is pointless, presuming you value discussion.

    Of late you seem rather fond of saying abstruse or just plain wrong things and slapping around commenters who call you on it. You didn’t write like that two years ago – why the change?

    But hey, if BASIC or VB or whatever works for you, have at it. My first language was BASIC on the VIC-20. Barring sentiment and nostalgia, I wouldn’t touch that language today.

  47. “C++ is C.”

    And there, folks, we have the entire summary of Greg’s argument, and a sparkling example of the depth of his understanding.

  48. Thanks, guys, for adding these two latest substance-free comments. What we are looking at here is a simple case of territoriality. You know that I’m right when I say that BASIC has been dropped as a language for development (I assume you won’t disagree with that) and this, not its fundamentals, is why it lacks so many of the features taken for granted in today’s languages. But I’m not a certified member of the club, so I’m not really allowed to make this argument. And the primary counter-argument to what I’ve said (after the first couple of volleys of utter bullshit that was simply inaccurate or irrelevant) is the simple statement that I’m not qualified, that I’m doing it wrong, and that I need to shut up.

    Bob, I have been annoying you on a regular basis for two years. I don’t get why you keep coming back.

    I have known a LOT of computer science people in my time. All of the smart ones, without exception, eventually moved on to other things. This tells me something.

    Am I being arrogant enough for you? I can turn it up if you like. Oh, and while you are answering that question, tell me something useful about programming languages, not just some shit you looked up on the internet.

  49. Ugh, Greg, sorry to say, but you’re really continuing to shoot yourself in the foot here.

    You know that I’m right when I say that BASIC has been dropped as a language for development

    In its heyday, BASIC was an easy and quick way to do things, when execution speed didn’t matter. For more “serious” stuff, it was asm all the way. (Or, in other areas, FORTRAN for crunching numbers).
    When I began programming in the early 90’s (with GW-BASIC), it was already only a language for beginners.

    why it lacks so many of the features taken for granted in today’s languages

    Wrong. Today’s languages lack many features taken for granted in today’s languages. Because today’s languages are even more varied than yesteryear’s languages.

    Some examples:

    If you want to do functional programming (and yes, that’s a completely different paradigm, a completely different way of thinking), C (or C++, for that matter) is completely unsuitable.

    If what you need to do can be cleanly/beautifully modelled with objects, you use an OOP language, like C++. Yes, you could do it in C and hack something resembling class and inheritance in, but it just wasn’t created with that in mind.

    If you need lots of regexps, you’d probably use Perl.

    I.e.: You use the right tool for the right job. Tacking feature after feature onto BASIC will only make it cluttered, never a usable functional language.

    And moreover, the article you linked to to strengthen your argument specifically mentioned it’s not talking about VB or VB.NET, but the old BASIC of Dijkstra’s days.

    But, I’m not a certified member of the club, so I’m not really allowed to make this argument

    No, you’re only ignorant about the topics you’re arguing about.

    What you’re doing is basically the equivalent of me, whose only knowledge of archaeology is what he saw in some videos on TV and YouTube, saying that the Inca, Maya and Aztecs were the same people.

    I have known a LOT of computer science people in my time. All of the smart ones, without exception, eventually moved on to other things. This tells me something.

    That you’ve listened to the Advanced Trolling lecture at the GSoC 2009 Mentor Summit?

    I mean, seriously, come on.

    tell me something useful about programming languages

    Once the code has been compiled, just by looking at the asm/disasm, you can’t find out for sure what language it was written in. You can only make educated guesses. And transforming said disasm back into something readable is tedious work.

    Which brings me to the point: While, yes, all Turing-complete languages are on the same level of power, so to speak, it’s still far easier to model certain problems in a certain manner, other ways taking a lot more work. And this includes, for example, creating fast compiled executables.

    You probably don’t even write your code in a text editor

    I for one do. I use vim.

  50. BASIC has been dropped as a language for development (I assume you won’t disagree with that)

    Said after referring to VB.NET which is in widespread use today as a part of BASIC. Your ignorance on this subject is only matched by your dumbness on it, it appears.

    and this, not its fundamentals, is why it lacks so many of the features taken for granted in today’s languages

    And here is the heart of your ignorance. It’s not a matter of “features”. If you had taken a single class in Computer Science you might have known this.

    But no, any hack can be a Computer Science expert. All it takes is that one writes one’s code in a TEXT EDITOR and one has a better understanding of Computer Science than people who have spent years learning its concepts and struggled long and hard hours to understand just why OO and functional paradigms are different than imperative paradigms.

    Your “expectations” in a previous post confirm this. In your little world everyone programs imperatively, just with different “features”, and some of them, like OO, are just “crutches”.

    You’re a dumb, ignorant ass, and you’re arrogant about it as well. How was it, “bugger off”, yeah, sounds about right. Having gone through the work of gaining skill in one academic area apparently doesn’t lead to the comprehension that other academic areas also require some work to understand.

    And I just wasted five more minutes I could have used for something more useful, like stabbing a fork in my leg, and you’re just going to dismiss this comment with a handwave anyway, since you write your code in a text editor. Ah well, my loss.

  51. FrogSpit,

    “C++ is C.”

    And there, folks, we have the entire summary of Greg’s argument, and a sparkling example of the depth of his understanding.

    Yes, that one is up there together with “if we came from monkeys, why are there still monkeys?” in displaying complete and utter ignorance of the subject matter.

  52. DrMcCoy,

    The comment system breaks my tags too. It breaks them differently in preview and in post as well, to make things interesting.

    In my previous comment, the italic was supposed to cover two paragraphs, not one. The code did, but the result didn’t – I bet that was because I used an edit box instead of a text editor.

  53. Dr McCoy, I hope your coding in “real” languages is a bit better than in HTML

    Oh, my HTML is fine. It’s your blog that’s at fault by adding paragraph tags on its own, in wrong places. Kinda like you, adding an extra space into my nick.

    But seriously, WTF, Greg? Blatant ad-homs? I expected more from you.

  54. fwiw, this (and many other) comment system has a tendency to correct less than signs with less than entities.

    I don’t think that I have a lot to add to this argument other than the one reason I prefer Python to any other language (including Applesoft BASIC) that I have ever used, which is that by its enforced structure, I can come back to code I wrote 3 years ago and still know what I wrote. I am, by nature, not the most orderly of people, and as such, the ‘crutch’ (if you can call it that) of Python for me is enforced structure and neatness.

    that, and that unless you try really hard to be obscure, python reads like pseudo-code. which means that I can generally understand what ‘you’ have written 3 years ago as well.

    on the other hand, this argument is really about beauty, not ability. so hotly contested opinions are to be expected. be honest, you sort of knew that this was going to be flamebait didn’t you? for the record, I use emacs… vim is for people who need to go to the other room to use the sink… 😉
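
    For what it’s worth, the “reads like pseudo-code” point above is easy to demonstrate; a small invented example:

        def average_above(values, threshold):
            selected = [v for v in values if v > threshold]
            return sum(selected) / len(selected) if selected else 0.0

        print(average_above([1, 5, 8, 2, 9], 4))   # 7.333...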

  55. Peter, I agree, that is a very good feature of Python. Although it is possible to ruin the code by breaking up the functions into the most esoteric possible arrangements and overloading implied tokens in control structures. If your main objective in life is to write a one-line Python program preceded by an arbitrary number of lines of function code, you might be a Perl programmer. Like being a redneck.

  56. Reading the comments above made me realize once again why humans are mortal. There is nothing that stops life from evolving to be immortal. At the least, organisms could evolve to live forever unless killed. But even if people stopped aging biologically they would continue to age psychologically: after several decades, or even a couple of centuries, they’d be so stuck up that they’d be completely inadequate. So that’s why old people need to die and clear the path for new generations of ideas.
    It is not the biology that would be the problem with living for a long time. It’s the psychology.
