The World's Most Maintainable Programming Language: Part 3

by chromatic

Have modern programming languages failed? From the point of view of learnability and maintainability, yes! What would a truly maintainable and learnable programming language look like? This is the third of a six-part series exploring the future of programming languages (read The World's Most Maintainable Programming Language: Part 1, The World's Most Maintainable Programming Language: Part 2, The World's Most Maintainable Programming Language: Part 4, The World's Most Maintainable Programming Language: Part 5, and The World's Most Maintainable Programming Language: Conclusion).


2006-04-03 03:34:05
'Because the goal of the language is to be as easy to learn as possible, it must use only a few primitives. Why? Compare decimal math with hexadecimal. People who haven’t already studied programming or higher mathematics find decimal much easier to use. It’s obvious why; it uses almost 40% fewer primitives, lacking A - F!'

Hmmm, not so sure about this. I think what you're really saying is that people who haven't already used hexadecimal find decimal easier to understand. That is obvious, but it is a matter of familiarity, not of primitive count. Binary has fewer primitives than either hex or decimal, but do people find it easier to deal with? Of course not.

I admire the principle of writing a series of articles such as this, but I think a lot of your arguments are convoluted, overly wordy, and often wide of the mark.

2006-04-05 04:13:20
[Java] rightly eschews operator overloading in general, as it is difficult to explain and fiendishly difficult to implement in practice

Bullshit. Operator overloading is quite simple (it's nothing more than writing a method), and it's the only way to maintain consistency between builtins and user-defined types.

2006-04-05 06:41:38
primitive != easy to use. Consider a human language with 100 words. Instead of "train" you have to say something like "the thing that runs on rails". The sentences in that language will be terribly verbose and full of duplication. That's why higher level concepts and abstractions are needed.

mysql_connect() and pg_connect() are also really bad examples. Have you heard of polymorphism?

2006-04-11 07:46:44
As I have stated before, the language should grow with the ability of the one who uses it. You don't give first graders Dostoevsky to read. Some are never able to read him.
So give beginners some constructs they can do something useful with, but also provide all the means an expert would use. If a beginner cannot understand it, it is because he is a beginner and needs to learn more, not because the expert wasn't able to write it down better (although that can be the case too, but then it wasn't an expert).

2006-06-23 10:43:08

I completely agree with Ben: decimal is not easier to understand because of the number of primitives; it is just what people are used to. Limiting the number of commands in your language could limit its usefulness. The number of commands should not be minimized for its own sake, but should be set so that the user of the language is capable of creating useful programs.

Eliminating options does not make something easier to learn. Everyone learns differently and everyone has his own preferences when speaking, so why not in programming? I'm not in favor of redundancy, but if there are two ways to do the same thing, a novice still only needs to learn one way, and later he may find one way easier than the other. Eliminating synonyms seems a lot like the "Newspeak" of George Orwell's Nineteen Eighty-Four, which we know was not doubleplusgood.

2006-10-02 19:32:56
... continued

Here's an example I came up with:

Patient p1;
Patient p2;
if ( p1.isDonorMatch( p2 ) )

Now, is p1 the donor? I hope so. I see this kind of code all the time. It is not immediately clear which instance is which. I would have named p1 'donor' and p2 'recipient'; that would clear things up. The isDonorMatch method is confusing too: which patient is the donor, the invoker or the passed-in parameter?

2007-02-07 17:16:29
I'm not a programmer, but I understand the basic concepts. I just think programmers need to program more efficiently, whether that means making their code readable with standard naming conventions or using memory efficiently.
I think that as our hardware technology gets better, programmers are programming inefficiently and wasting memory in their code.
No one is breaking programming down bit by bit anymore, are they? There is more memory and faster I/O nowadays, so I assume programmers are getting sloppy compared to programming 15 to 20 years ago. Twenty years ago you either used resources efficiently or you got bad performance. It really doesn't matter to programmers today, as they can write sloppy code and still get pretty good performance.
What I'm getting at is that you should stick to the fundamentals of programming and think of it bit by bit. I'm not a programmer and I'm only 27 years old, so I don't care if this makes sense or not.

Later you code slangers

2007-07-18 21:43:34
I take it you guys have never heard of the "one-instruction set computer"?

OISC has only one instruction (subtract and branch if the result is negative), but I don't think it makes programming any more enjoyable!

Have fun in the Turing Tarpit :)