[clue-talk] Stupid OCaml tricks

Matt Gushee matt at gushee.net
Mon Aug 14 16:16:59 MDT 2006


Nate Duehr wrote:

> On Aug 14, 2006, at 12:55 PM, Matt Gushee wrote:
> 
> [snipped long example of various OCaml vs. something else programming...]

Not really. You could do it either way in OCaml.

> "They're both doing the same things at the hardware level."

Sure, but I'm talking about an *interactive* program, not batch 
processing. What happens *when* is crucial.

> I very much disagree with your statement that performance is more 
> negatively affected by the first example vs. yours.

Perhaps I should have qualified that statement a bit. The key point is 
that in my version the branching is restricted to the time when the user 
specifically invokes a mode change, which in my opinion is the time when 
delays will be most tolerable. (BTW, I notice you haven't really 
addressed my other point, about the design of the program).
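
To make that concrete, here's a minimal sketch of what I mean (the names 
are made up for illustration, not from my actual program): the current 
handler lives in a ref and is swapped exactly once, when the user changes 
mode, so the per-keystroke path has no mode branching at all.

```ocaml
(* Hypothetical sketch: pay the branching cost at mode-change time,
   not on every event. All names here are illustrative. *)
type mode = Browse | Edit

let browse_handler key = "browse:" ^ key
let edit_handler key = "edit:" ^ key

(* The current handler is a ref; it is reassigned only when the user
   explicitly invokes a mode change. *)
let handler = ref browse_handler

let set_mode = function
  | Browse -> handler := browse_handler
  | Edit   -> handler := edit_handler

(* The per-event code just calls through the ref -- no match on mode. *)
let on_key key = !handler key

let () =
  set_mode Edit;
  print_endline (on_key "x")   (* prints "edit:x" *)
```

The alternative is to `match` on the current mode inside `on_key`, which 
does the same work overall but does it on every single event instead of 
only at mode-change time.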

> My recent favorite article I keep coming back to and pondering its depth 
> is this one about "Leaky Abstractions" from the Joel on Software guy:
> 
> http://www.joelonsoftware.com/articles/LeakyAbstractions.html
> 
> It seems to me that more and more and more languages just makes the 
> above problem worse.

I don't think the number of languages *in existence* has anything to do 
with it. Surely, the number of *layers* in any given system does. And I 
suppose if any given individual or group is involved with too many 
languages to master any one of them, that's obviously an issue. But the 
fact that something is new (or different) doesn't mean it's badly 
designed or implemented. Java vs. C#? Sure, we dislike Microsoft for 
various reasons, and we know they've released a lot of crappy software. 
And C# is specifically a me-too language. Nonetheless, it has been 
argued by people who have no particular axe to grind that C# is 
better-done than Java. I don't really know if that's true or not, and 
don't really care--I could come up with a dozen similar examples, some 
of which will be true. BTW, OCaml isn't all that new--the Caml family 
dates back to 1985 (OCaml itself appeared in 1996), and it has its roots 
in ML, which goes back to the early 1970s.

> I'm not sure how you get the typical coder off of 
> his/her huge interest in "new things" long enough to explain that new 
> languages are just "yet another boring abstraction tool"

They are that, but they're not just that. Maybe for someone who's such a 
friggin' genius they can look at any programming language and see 
straight through to the assembly code behind it, it makes no difference 
what language they use. Most people aren't quite that smart (or maybe I 
should say their intelligence is focused elsewhere), so I will insist 
that the structure and syntax of the language--and the idioms--can make 
a big difference in how effectively you can think about the program--and 
thus how productive you are in going from design to implementation.

> and get them to 
> focus in on solid code writing and quality output.

Always important. I don't disagree with you there.

> Maybe it is best summarized this way: If two programs produce the same 
> inputs and outputs and are written in two different languages... Who 
> cares?  Only the programmer.  The end-users don't care about what it's 
> written in, and never will

Certainly the users don't care, and that's probably a good thing. 
Because if they did care, they would probably be demanding software 
written in Visual Basic or Java.

That's semi-facetious, but the point is--as I'm sure you agree--good and 
bad code can be written in any language, and their fundamental 
capabilities are all pretty much the same. But that's very different 
from saying it makes no difference which language you use. It makes a 
big difference because (a) any given developer is likely to think better 
in one language than another; and (b) the quality of the tools and 
libraries varies greatly between different language implementations--or 
perhaps I should say different toolkits are good at different things. 
Java is good at security and internationalization, but pretty bad at 
performance. Visual Basic is good for rapid GUI development, etc.

So people/organizations should use whatever tools help them to deliver a 
good product. I'm not saying everyone should use OCaml, just that it's 
an alternative that might work better for some.

> I'm not sure I've shaken this whole idea of "silly multiple language 
> wastes of time"

What's silly? The fact there are many different high-level languages 
that all translate to the same bits? Okay, if you really want to be a 
fundamentalist, go ahead and say it--we should all just code everything 
in assembly. But I can out-fundamentalist you any day: we'd all be 
better off if PCs had never been invented.

However, we live in a world where a lot of non- and semi-technical 
people use computers and expect to have access to a wide variety of 
complex, high-level applications. Tell me how that demand can be 
satisfied by coding everything in assembly, or even in C.

> hopefully there's enough evidence there that I'll avoid "learning 
> another new language" for no reason at all,

If you have no reason to learn something new, then don't. I never said 
you had to.

> and instead have the 
> discipline to work on refining the code written in whatever language I 
> already wrote it in.  So maybe this revelation will help me personally 
> to stop screwing around with whatever language fad has hit this year and 
> work on a longer-term goal of writing stuff that works so well, no one 
> cares what language it was written in.

Sure. OCaml isn't a fad, though. It's got a lot of solid theory and 
practice behind it, and a small but smart and serious community around 
it. My presentation is likely the most hype you'll ever see about it. 
You want fads? Try Ruby on Rails.

> Just some thoughts... what do you think?  I really think evangelizing 
> ANY language is far less useful than just working on stuff... at this 
> point.

You may be right, but then why have meetings and presentations at all?

And personally, I'm not a big fan of technology evangelism in general. 
One of the reasons I didn't do well as a "technical instructor" was that 
a large part of the "instruction" was supposed to be uncritical 
evangelism of XML, about which I always had some skepticism. For what 
it's worth, I couldn't advocate any technology unless I truly believed 
it was worthwhile, and even then I try to be honest about its shortcomings.

> Faster and faster machines make it "not worth looking at" in most 
> applications.

That's an interesting point. I think it depends partly on who your users 
are. If you are an enterprise developer, nobody cares about performance 
because they can just buy more hardware and recover the costs from their 
customers. If you are targeting home and small-business users I think 
performance still matters--though not necessarily in the sense of raw 
number-crunching performance. But the responsiveness of a GUI, for 
example, is very important, and often neglected because it isn't 
glamorous (and you can sell millions of $$ worth of software before 
people find out how badly it sucks). If I were still trying to make a 
living as a developer I would be targeting the latter market, probably 
small businesses in particular, because those are people that I respect.

> Joel's comments above about knowing how the hardware works all the way 
> down to the bit level, is very interesting.  If you had to guess, out of 
> all the developers you've met, how many have coded in Assembly on their 
> chosen work platform?  Not many, I'd say.  Ever meet someone who has 
> coded in Assembly on their chosen platform that didn't know their 
> "stuff" ten times better than everyone else you ever worked with on that 
> platform?  Not me.  Anyone who's had the discipline to go to that level 
> -- is always an EXCELLENT programmer.  They usually have amazing insight 
> into how to get the most out of the chosen hardware platform too.

That's a valid point, and I think people like that (like you, maybe?) 
deserve more recognition and respect than they often seem to get. But 
that's not the whole picture: someone needs to design applications for 
users--*for people.* Very few people are good at both aspects of the 
job, but both have to be done well to create a really good product.

> Take (for another example) GUI interfaces.  Has any REAL paradigm change 
> happened in them since the machines at Xerox added a mouse?  Sure 
> they're prettier, but not a single bit easier to use for anyone.  Some 
> people take the time to try to make them as good as possible, but it's 
> so rare... they'd rather switch from raw X coding, to Qt, to GDK, to god 
> knows whatever else every few years and never produce anything really 
> different or better,

True. The GUI toolkits just keep on getting more complex. But I don't 
think either going back to basics (raw Xlib) or continuing to add layers 
is the solution. Widget libraries give you productivity in creating 
applications, but that doesn't mean that more is better. I think there 
must be a happy medium somewhere.

-- 
Matt Gushee
: Bantam - lightweight file manager : matt.gushee.net/software/bantam/ :
: RASCL's A Simple Configuration Language :     matt.gushee.net/rascl/ :



More information about the clue-talk mailing list