[clue-talk] Stupid OCaml tricks

Nate Duehr nate at natetech.com
Mon Aug 14 14:30:25 MDT 2006


On Aug 14, 2006, at 12:55 PM, Matt Gushee wrote:

[snipped long example of various OCaml vs. something else  
programming...]

Matt,

While reading your comments about performance being affected by  
checking the state of various things, and then reading your OCaml  
solution, all I could think of was...

"They're both doing the same things at the hardware level."

I very much disagree with your statement that performance suffers more
in the first example than in yours.  The data is handled differently
in the programmer-to-screen interface, but under the hood, the chips
are doing the exact same lookups and it'll take about the same amount
of time.
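I can't reproduce the example I snipped above, so take this as a rough
in-spirit sketch in OCaml alone, with a made-up state type that stands
in for whatever your code was actually checking: an explicit if/else
chain and a pattern match over the same state ask the machine the same
handful of questions, even though they read very differently to the
programmer.

(* Hypothetical "state" type -- a stand-in, not the real code from
   the snipped example. *)
type state = Idle | Running | Stopped

(* Style 1: explicit checks, the way the "other language" version reads. *)
let describe_if s =
  if s = Idle then "idle"
  else if s = Running then "running"
  else "stopped"

(* Style 2: the OCaml-ish pattern match over the same states. *)
let describe_match s =
  match s with
  | Idle -> "idle"
  | Running -> "running"
  | Stopped -> "stopped"

let () =
  (* Both functions give the same answer for the same state; the point
     of my argument is that the work the hardware does is essentially
     the same either way. *)
  print_endline (describe_if Running);
  print_endline (describe_match Running)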

Actually, it's all about the low-level code in OCaml's interpreter or
the compiler for the other language.  Even if you'd given Perl
examples or some other "scripting language", the raw twiddling of the
hardware bits is still the same.

In fact, you start to make my point above when you talked about there  
not being much *real* difference between an interpreted (scripting)  
language and a compiled one.
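OCaml itself is a handy illustration of that: the same source file
goes through either the bytecode compiler (ocamlc) or the native-code
compiler (ocamlopt) without changing a character, so "interpreted vs.
compiled" collapses into a build choice.  A trivial example, just to
make the point concrete:

(* hello.ml -- the same source builds either way:
     ocamlc   -o hello.byte hello.ml    (bytecode, run on the OCaml VM)
     ocamlopt -o hello.opt  hello.ml    (native machine code)
   The program itself neither knows nor cares which one it is. *)
let () = print_endline "same source, bytecode or native"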

A favorite article that I keep coming back to and pondering lately is
this one about "Leaky Abstractions" from the Joel on Software guy:

http://www.joelonsoftware.com/articles/LeakyAbstractions.html

It seems to me that more and more new languages just make the above
problem worse.  I'm not sure how you get the typical coder off of
his/her huge interest in "new things" long enough to explain that new
languages are just "yet another boring abstraction tool" and get them
to focus on solid code writing and quality output.

Maybe it is best summarized this way: If two programs accept the same
inputs and produce the same outputs and are written in two different
languages...  Who cares?  Only the programmer.  The end-users don't
care what they're written in, and never will -- other than some
end-users who are developers and have their "pet" languages they like
to "support" by using things written in that language.

I'm not sure I've shaken this whole idea of "silly multiple-language
wastes of time" out in my head well enough to produce a concise,
well-written article on the topic that would persuade anyone, but
hopefully there's enough evidence here that I'll avoid "learning
another new language" for no reason at all, and instead have the
discipline to keep refining the code in whatever language I already
wrote it in.  So maybe this revelation will help me personally to stop
screwing around with whatever language fad has hit this year and work
on a longer-term goal of writing stuff that works so well, no one
cares what language it was written in.

Just some thoughts... what do you think?  I really think evangelizing  
ANY language is far less useful than just working on stuff... at this  
point.  Maybe I'm just getting old and grumpy.  :-)  ?

Example: I can probably find ten websites that do the analysis you
did -- why the language looks prettier to the programmer and why
people should use OCaml.  What I can't find in today's undisciplined
environment is someone who's looked hard at the underlying code and
benchmarked how fast and how well the real bit twiddling is done under
the hood.  It's so time-consuming, and so hard to get paid for, that
it's just not the kind of introspection we see much of in the IT world
these days.  Faster and faster machines make it "not worth looking at"
in most applications.

Joel's comments above about knowing how the hardware works all the
way down to the bit level are very interesting.  If you had to guess,
out of all the developers you've met, how many have coded in Assembly
on their chosen work platform?  Not many, I'd say.  Ever meet someone
who has coded in Assembly on their chosen platform who didn't know
their "stuff" ten times better than everyone else you ever worked
with on that platform?  Not me.  Anyone who's had the discipline to
go to that level is always an EXCELLENT programmer.  They usually
have amazing insight into how to get the most out of the chosen
hardware platform too.

But then the lazy folks write another language and obsolete the poor
guy, and poof... he's gone.  Out of the industry, usually consulting
for companies still using the old tech for years afterward, still the
hottest programmer on that platform anyone's ever seen, while the
crazy mob-mentality IT world that needs the next new shiny thing has
moved on... still not producing anything even an order of magnitude
better than the original.

Take, for another example, GUIs.  Has any REAL paradigm change
happened in them since the machines at Xerox added a mouse?  Sure,
they're prettier, but not a single bit easier to use for anyone.
Some people take the time to try to make them as good as possible,
but that's so rare... most would rather switch from raw X coding to
Qt to GDK to god knows what else every few years and never produce
anything really different or better, having worked hard on their
INTERFACE to the machine just to get it to produce the SAME thing.

Hopefully all this rambling makes a little sense to someone?  What
I'm mostly pondering these days is... how will it ever change?  How
do you motivate new coders to write stuff WELL instead of caring
about what language it's in?  Know a language, know ten if you like,
but learn how to code properly first... that would be the sentiment
I'd be interested in conveying.

Ah well, just some random thoughts for a Monday while I'm trying to  
avoid digging through a log file to find a problem... back to work...

--
Nate Duehr
nate at natetech.com





