Article:
  WinFX: An All-Managed API
Subject:   .NET, Windows API and C
Date:   2004-03-25 05:50:23
From:   igriffiths
Response to: .NET, Windows API and C

Having written in several different assembly languages, as well as C, C++, C# and various other languages, I cannot agree with your statement that "despite the popular belief there is no such a comparison between assembly language, C and C# for example."


There is a clear progression of increasing levels of abstraction:



  • Assembly language maps directly onto machine instructions (and given the way that all modern CPUs are implemented, those are abstractions around what the CPU really does).

  • C moves away from the specifics of the instruction set, but still provides no significant extra functionality - you're still using the same abstractions of raw memory and basic data types such as int and float. (And of course any decent assembler will also let you define structs.) So the only real level of abstraction that C offers is to take you away from the specific instruction set. It doesn't add anything else. There's a reason it's frequently described as 'machine-independent assembly language' - it provides the same set of abstractions, but in a more or less machine-independent form.

  • C++ adds some higher-level features. If you want virtual functions, you are no longer obliged to implement the mechanism yourself as you would have to in C or assembly language. Also, if you want compile-time polymorphism, templates can do that, rather than forcing you to rely on less powerful and more error-prone macro expansions. And because these features are integrated into the compiler, you get much better compile-time static type checking. However, you're still basically dealing with raw memory, which is, significantly, the main reason it's so easy to write code with buffer overflows in C, C++, and assembler alike.

  • C# and Java move away from making virtual memory the basic abstraction for how information is stored, moving instead to a managed, intrinsically typed heap. This is hugely important - think how many of the security holes out there today are caused by buffer overflows. Buffer overflow bugs are an artifact of dealing with memory as your basic abstraction, and eliminating them is one of the key benefits of moving to C# or Java. (There's a short sketch of this after the list.) Of course this means you can't use these languages to write interrupt handlers and device drivers. That's why I would still use C for that kind of code. But that's really all C is appropriate for these days.


  • The fact that anything you can do in C# can also be done in C clearly also carries down to assembly language too, so I don't see why you think there's no comparison - it's a clear progression of abstraction: assembler abstracts away from the transistors, C abstracts away from the instruction set, C++ adds abstractions not intrinsically available in assembler, C# and Java abstract away from memory and pointers. And the benefit of this progression is evident in use: just as C usually requires much less effort to get something done than assembly language, so C++ can make things less effort than in C, and C# usually requires much less effort to get things done than C++.
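
    To make the buffer overflow point concrete, here is a minimal C# sketch (my illustration, not anything from the original discussion): on the managed heap, every array access is checked against the array's length, so an out-of-range write that would silently corrupt adjacent memory in C, C++, or assembler becomes an exception instead.

        using System;

        class BoundsDemo
        {
            static void Main()
            {
                byte[] buffer = new byte[16];
                try
                {
                    // In C, "buffer[32] = 1" would scribble past the end of
                    // the allocation; the CLR checks the index first and throws.
                    buffer[32] = 1;
                }
                catch (IndexOutOfRangeException e)
                {
                    Console.WriteLine("Caught: " + e.Message);
                }
            }
        }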


    C was the highest-level language you could reasonably use back when the Mac and Windows first came out, but this wasn't because C was an ideal language - it's clearly not! Remember that the Lisa, the Mac's predecessor, had already tried to use an object-oriented language. (The Lisa toolkit was written in an OO-enabled derivative of Pascal called Clascal.) The Lisa was ahead of its time in many ways, including the attempt to use an OO language before the hardware was powerful enough to support this in real apps. So it *is* about hardware limitations, or at least it was as far as the people who designed the systems you gave as examples were concerned - people recognized the productivity benefits of moving away from the low level of abstraction offered by C 20 years ago, but back then, personal computers weren't up to the job.


    Of course you can use an OO style of development in C. I've done this, and it's an experience I'm in no hurry to repeat. It is so much more effort than using a system which was designed to support this style of development in the first place. You can use an OO style in assembly language if you really want to, but that doesn't make it the right technology with which to write a web site. Why on earth would you expend so much effort doing it by hand when there are languages and tools available to do it for you?


    Apple would have used an OO language for the Mac if they possibly could have. The only reason they didn't is that they had already tried that on the Lisa and found it was too heavy to fly.



    If your .NET Windows applications are taking 400MB of memory, you are doing something wrong. In fact, if they're taking 100MB of memory there's a good chance you're doing something wrong. (Unless you really are working with that volume of data in your calculations, in which case a C application would use just as much memory.)


    This is not evidence of a problem intrinsic to the .NET framework, because none of my .NET programs do this. This is much more likely to be indicative of implementation deficiencies in the program(s) you were looking at. It's most likely to be a case of using inappropriate techniques. A different platform requires different techniques. If you wrote code in C that assumed memory was managed in the same way as it is in C#, you'd have a leaky malfunctioning piece of code, and you would be right to accuse the author of incompetence. So why do you leap to different conclusions when you see code doing the wrong thing in C#? Just because it is possible to write code that hemorrhages memory in a particular language doesn't mean it's necessarily the language that is at fault. I can write you a C program that takes 400MB of memory too, but that wouldn't be evidence of a problem in C. It would just be a program that takes 400MB of memory for some reason...
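
    To illustrate that last point, here's a trivial sketch (mine, purely illustrative) of a C# program that takes ~400MB for a reason that has nothing to do with the platform - the equivalent allocation in C would cost exactly the same:

        using System;

        class MemoryHog
        {
            static void Main()
            {
                // ~400MB allocated in one line; any language can do this,
                // and any program that does will have a ~400MB working set.
                byte[] hog = new byte[400 * 1024 * 1024];
                Console.ReadLine();  // keep the process alive for inspection
                GC.KeepAlive(hog);   // ensure the array isn't collected early
            }
        }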


    I've got a .NET Windows Forms application running right now that I wrote. (It's a utility that takes syntax-coloured code in RTF format and converts it into HTML.) I'm running the debug build as it happens, and because it's a very simple app for my own use I've never bothered to try to optimize its memory footprint, load time, or execution speed. It is, as you put it, "a program consisting of a simple form with a few controls on it and doing nothing more than sitting idle." It's taking 14MB of memory, which is normal. So no, it's not normal for such .NET programs to take 100MB. There's clearly something wrong with the program you're using as an example. And yes, 14MB is more than the equivalent in C would have taken - maybe twice as much. But I have 1GB of memory on this laptop, and it took about 5 minutes to get the UI working. It would have taken far, far longer than that to build the UI in C. (It does automatic resizing of all the content, and interacts with the system clipboard. These are certainly more than 5-minute jobs in C. They are utterly trivial in C#.) The only thing that took any amount of time to write was the RTF parsing - in other words, I was able to devote all my time to the problem at hand, rather than messing about with the vast number of little details required to write a GUI in C.
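
    In case "utterly trivial" sounds like an exaggeration, here's a rough sketch along the lines of that utility (illustrative only - it is not the actual program, and ConvertRtfToHtml is a hypothetical placeholder for the real work): docking gives you the automatic resizing, and the Clipboard class gives you system clipboard access, each in a line or two.

        using System;
        using System.Windows.Forms;

        class ConverterForm : Form
        {
            TextBox output;

            ConverterForm()
            {
                Text = "RTF to HTML (sketch)";
                output = new TextBox();
                output.Multiline = true;
                output.Dock = DockStyle.Fill;   // automatic resizing: one line
                Button convert = new Button();
                convert.Text = "Convert Clipboard";
                convert.Dock = DockStyle.Bottom;
                convert.Click += new EventHandler(OnConvert);
                Controls.Add(output);
                Controls.Add(convert);
            }

            void OnConvert(object sender, EventArgs e)
            {
                // Reading RTF off the system clipboard: essentially one line.
                IDataObject data = Clipboard.GetDataObject();
                string rtf = (data == null)
                    ? null : (string)data.GetData(DataFormats.Rtf);
                output.Text = ConvertRtfToHtml(rtf == null ? "" : rtf);
            }

            static string ConvertRtfToHtml(string rtf)
            {
                return rtf; // placeholder - the parsing is where the time goes
            }

            [STAThread]  // the clipboard requires a single-threaded apartment
            static void Main()
            {
                Application.Run(new ConverterForm());
            }
        }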


    So it's just untrue to say that even simple .NET applications are inherently, ludicrously memory-hungry. They're a bit more hungry, but I certainly don't see hundreds of megabytes getting used by my own applications. If you're using vast wads of memory, it's highly likely to be because of what the program is doing, rather than the platform it runs on.


    (For example, the RSS reader I use, SharpReader, is using 72MB right now! People sometimes point at SharpReader as an example of how .NET applications use unnecessarily large amounts of memory, but it's a flawed argument. As the guy who wrote SharpReader points out, its memory usage has nothing to do with .NET - it's because he wrote the application in such a way that it holds vast amounts of stuff in memory. If he had written it in C, it would use exactly as much memory, simply because of how it works. As I understand it, it actually loads all the content of all the messages into memory. That's why it takes up lots of space. If you loaded the same state into memory in a C application it would be just as bad. C# applications that don't work with such large data sets don't have such large working sets.)



    "About the faster machines - on my P4 2.4 Ghz Java runs just as slow as on my P2 300 Celeron, that's all marketing tricks but is seems you and many other people just don't get it."


    Marketing? All I see is that my productivity has gone through the roof, and the performance of the systems I work on is good enough. What on earth does that have to do with marketing?


    To give you a recent example, I worked on a rewrite of a system from VB6 to C#. It took three months to rewrite the system that had taken three years to write in VB, and it runs approximately 10 times faster. We know this because we measured how fast it runs, and we measured how long it took to write. It doesn't get much simpler than that - what makes you think this has anything to do with marketing? It's simple, straightforward good business!



    I was highly sceptical when I first saw the .NET framework. Like you, I didn't see any obvious benefit gained in exchange for the loss of control.


    But then I actually tried using it. With an open mind. After one project, I realised I had been wrong. Now, many projects down the line, I won't use C++ unless I have a particularly good reason to. (And certainly not C - why on earth would I want to invent my own idioms for object orientation that the compiler doesn't understand, and can't type check? I have better things to do with my time!)


    With all due respect, it sounds an awful lot like you've not tried writing a non-trivial project in C#, so your opinion is not informed by experience. (You talked about a project porting a VB application to .NET - that's always going to be a problematic thing to try to do, because VB and VB.NET use different sets of abstractions. It's going to be about as easy as porting an Ada program to C - you'd have loads of problems doing that too, but would you use that as an argument against C? And you cite "finding obscure bugs" as one of the problems - I don't think you could honestly tell anyone that C programs never suffer from obscure bugs and expect them not to laugh in your face.)


    What's all the more remarkable is that all the arguments about not reinventing stuff and having things done for you which you put forth in favour of C are arguments that I would use in favour of C# - all the same things apply, but *much* more so because C# and .NET do so much more for you than C and Win32 ever did. I use both .NET and Win32 regularly, and believe me, when programming against Win32 you spend *far* more time doing things which in .NET are done for you. And a lot of it is down to the higher level of abstraction provided by the platform - you simply couldn't have all the things that C# automates automated in C, because it would be incompatible with C's "everything is bytes stored in memory" level of abstraction.
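
    One small example of what I mean (my illustration): reading a whole text file into a string. Against raw Win32 that means CreateFile, GetFileSize, ReadFile and CloseHandle, plus manual buffer management and character encoding handling; in .NET the buffering, decoding, and cleanup are all done for you:

        using System;
        using System.IO;

        class ReadWholeFile
        {
            static void Main()
            {
                // "input.txt" is just a stand-in filename for the example.
                StreamReader reader = new StreamReader("input.txt");
                string text = reader.ReadToEnd();
                reader.Close();
                Console.WriteLine("Read " + text.Length + " characters.");
            }
        }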


    "I've seen a very complex but briliantly written programs on plain C. One example is the Mosaic code. And i've seen a really sloppy programs written in C++, Java and C#"


    Sure, and I've seen lots of incompetently written C and lots of great C++, Java, and C#. Bad code is bad code, and good code is good code. What does that have to do with choosing a language?


    Assume for a moment that we're going to write good code. Doesn't it then make sense to choose the platform and language that gives us the best productivity? I've tried C, C++, Java and C# - I've written substantial amounts of code in all of these languages on a variety of platforms and systems - Windows, Unix, Mac, and various embedded systems. C# and Java blow the rest away when it comes to productivity. You almost invariably end up writing far less code to get anything done. Less code means less initial work, fewer bugs, and lower maintenance costs.

