[CLUE-Tech] Data models, diagrams, process models documentation

Jed S. Baer thag at frii.com
Sun Aug 17 11:35:10 MDT 2003


On Sun, 17 Aug 2003 10:10:19 -0600
David Anselmi <anselmi at americanisp.net> wrote:

> Sean LeBlanc wrote:
> [...]
> > [*] And I really can't grok people who just make database changes
> > directly via tools like SQL Enterprise Manager, isql, or SQL*Plus or
> > what have you.
> 
> Hear, hear!  At my last job, part of the CM I did was to develop a way 
> to apply DB changes consistently.  That can be a tricky thing and what I
> wrote wasn't bulletproof.  But at least I could be confident that the 
> testers were testing what would go to production.
> 
> We did a major rewrite at one point and the data migration scripts (that
> fit into my CM framework) were very hairy.  When we rolled it out, the 
> code update took 10 minutes and the data migration 2 hours.  I left at 
> that point, figuring it was done but our DB guy checked a few things and
> found a bug.
> 
> Instead of rolling it back and fixing/testing the bug he stayed until 
> 3am hacking on it.  Monday morning all was well.  Tuesday things blew up
> because of bugs he put in while hacking, two days to fix.  Fortunately 
> this was a rare occurrence.

( I'm not sure where the boundaries of this question are. In Sean's
original post, it sounded as if he were comparing running a
human-generated (set of) canned conversion scripts/programs from the
command line vs. "push button" execution via a tool. In the context of
Dave's reply, Sean's opening sentence sounds more like "type each
individual command separately", which I've never seen anyone try to do on
a production box. I have seen single-statement production upgrades done
via cut&paste into SQL*Plus.)
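(For the archives: the "canned scripts, applied in the same order everywhere"
idea that Dave describes can be sketched in a few lines. This is a minimal
illustration, not Dave's actual CM framework; it uses SQLite and made-up
migration names for brevity, but the same pattern works with SQL*Plus-driven
scripts against Oracle.)

```python
import sqlite3

# Hypothetical ordered list of canned change scripts. In a real framework
# these would be files checked into version control.
MIGRATIONS = [
    ("001_create_customers",
     "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)"),
    ("002_add_email",
     "ALTER TABLE customers ADD COLUMN email TEXT"),
]

def apply_migrations(conn):
    # Record which scripts have already run, so the identical runner can
    # be executed against dev, test, and production in turn -- the testers
    # test what will actually go to production.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_version (name TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT name FROM schema_version")}
    for name, sql in MIGRATIONS:
        if name in applied:
            continue  # already applied on this database; skip
        conn.execute(sql)
        conn.execute("INSERT INTO schema_version (name) VALUES (?)", (name,))
        conn.commit()

conn = sqlite3.connect(":memory:")
apply_migrations(conn)
apply_migrations(conn)  # second run is a no-op: nothing re-applied
```

The point isn't the tool; it's that every environment gets the same scripts
in the same order, and the database itself records what has been applied.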

Let's not confuse the tool with the outcome, or the process. I've done
plenty of production turnovers without using any of these high-level
tools. Many factors determine whether the smarter/harder question turns
in favor of the high-level design tool. In practice, one of the major
factors (particularly in the case of behemoths such as Oracle Designer)
is whether people actually take the time (or have the time, in some
cases) to come up to speed with them. Another is trusting the tool too
much: "It was generated by {tool name}, so it must be right". High-level
tools, deployed properly, can be big time savers, particularly as they
mature. Oracle Designer, for example, in 1996 didn't support some Oracle
server features which were critical to a project I was working on. Last I
used it (late 2000), it still had a different namespace model for
database object names than the one used by the Oracle server, but it was
still vastly improved for most database design tasks.

Dave's example perfectly illustrates a failure of process, not of the
tool. High-level tools can make many pieces of the process easier for
humans, but human failure to adhere to the process, whether the scripts
are written by a programmer and executed from the command line or
generated and executed from within a high-level tool, is the ultimate
cause of the majority of implementation failures.

jed
-- 
... it is poor civic hygiene to install technologies that could someday
facilitate a police state. -- Bruce Schneier
