[GRLUG] sysadmin job opening

Bob Kline bob.kline at gmail.com
Mon Feb 1 12:04:44 EST 2010


On Mon, Feb 1, 2010 at 11:24 AM, Michael Mol <mikemol at gmail.com> wrote:

> On 2/1/2010 11:04 AM, Bob Kline wrote:
> > One could throw in parallel programming.
>
> As what? You didn't intersperse your reply, so I don't know what this is
> in reply to.
>

Is interspersing top posting, bottom
posting, or just common sense posting?
Just asking.

OK, this is getting a little tedious, if
not fruitless, so to wind down, the
point was in regard to your list
of ways to speed things up:

**********************
>
>     * Throw more/better hardware at the problem.
>     * Take advantage of caching in more places
>     * Refactor the code to meet different design requirements.
>     * Refactor the code because you learned how to better write to the
>     language
>     * Change your execution environment.
***********************
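
To make the caching item concrete, here's a minimal
sketch in Python; the function and its cost are made
up purely for illustration.  The idea is just to
memoize an expensive call so that repeating it does
not repeat the work:

    # Cache results of an expensive, deterministic function so the
    # price is only paid once per distinct argument.
    _cache = {}

    def expensive_query(n):
        # stand-in for a slow computation or lookup
        if n not in _cache:
            _cache[n] = sum(i * i for i in range(n))
        return _cache[n]

    expensive_query(1000000)   # slow the first time
    expensive_query(1000000)   # cheap: the answer comes from the cache

The same idea shows up at other layers - query caches,
HTTP proxies, memcached and the like - the second
request for the same answer shouldn't redo the work.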

>
> >
> > But what emerges is that someone
> > has to decide how to balance cost,
> > performance, and who can do what.
> > Knowing the strengths and weaknesses
> > of most platforms and languages today
> > is probably beyond any one person, so
> > the decision is inherently difficult.
>
> One doesn't need to know everything, or even most things, in order to
> have a broad enough toolkit to know when to pull out a screwdriver
> instead of a claw hammer.  It's good to be passably proficient in
> multiple languages, so you know when one tool is likely to be better
> than another.
>

I would argue that with so many
languages, one's personal toolkit
might miss the mark.  One could get
into the marginal utility of using yet
another language as opposed to
ensuring, say, maintainability.  E.g.,
a little COBOL piece might not be
good for long-term maintainability.

>
>  > Individuals
> > often argue for what they know - both in the
> > hardware and software side of things.
>
> I don't understand how this is relevant.
>

It's relevant because people will often
choose what they know, being limited
by that perspective.  Only people with
broad knowledge and/or experience
can choose what is by some measure best.
And of course anything that becomes
commercially important will likely be
overhauled at some point anyway.  Often
ideas and concepts transcend any actual
implementation.


>  > Having
> > to come up to speed on something new
> > guarantees that amateurs will be doing the
> > job, and the results will reflect this.
>
> A proficient programmer needs to be at least passably familiar with
> multiple tools. Otherwise, he's not much better the guy who gets hired
> because he's good with a sledgehammer.
>

By definition of a proficient programmer.
And of course multiple tools can mean
a lot of tools today.  Are many people
actually like that?  If you actually
need a person good with a sledgehammer,
that changes things too.  Management is
a tough business when done right....

>
> It's also true that in the computing fields, the cost equation for how
> to get the best computing bang for the hardware buck leads to constant
> shifting around in the hardware and network architecture. Twenty years
> ago, it was thick-client software. Ten years ago, it was the gigahertz
> race. Recently, it's been the multicore race.  Soon, it's going to be a
> split between breaking problems into massively parallel work sets and
> making things run on processors without some of the in-CPU optimizations
> we've come to depend on.
>
> And, certainly, things will shift in a different direction within the
> next fifteen years.
>

This mostly says there is some kind
of progress on the technology front.
If past performance is any guarantee
of future performance (...) there will
indeed be more changes, just by
extrapolation.

>
> All of this leads to new languages, new libraries, and new requirements
> for how one thinks about a problem.
>

Yes, assuming some kind of metric
shows things are better, as opposed
to just different.  That happens too.
Every field has elements of fashion
and fad bandwagons, mostly temporary
and devoid of substance.  The new thing
on the block is almost always better.  For
a while.

>
>  > If
> > engineering departments could get around
> > these points, then one would have demonstrated
> > the intelligence of a mob.  But any working
> > person knows that a camel is a horse designed
> > by a committee.
>
> I don't know where you're going with this, unless you're trying to say
> that languages like PHP try to be too many things to too many people.
> However, that would seem to conflict with your implied argument that
> C/asm should be enough for everyone. If that were the case, what other
> features could a language designed by committee possibly pick up?
>

It's not going anywhere - it's for discussion's
sake.  The issue is how to get things done in
the most cost-effective way over the longer
run, and what factors go into that.

>
> >
> > Perhaps the proliferation of "languages" has
> > simply complicated everything.  Each new one
> > is supposed to be the greatest thing since
> > sliced bread - one's thesis said that.  But it
> > rarely works out that way.
>
> Ech...You're only hearing about the languages that are trumpeted and get
> a lot of chatter.  There's more than just those out there, believe me.
> I was surprised to hear that someone on this list uses R.  I've seen
> plenty of R code, but I hadn't noticed anyone talk about it outside
> Rosetta Code.
>
> >
> > In the not so distant past it was clear that
> > if you wanted a program to execute 10X faster,
> > buy hardware that went 10X faster.  Concepts
> > like caching have been around a long time,
> > but only recently has the hardware been
> > cheap enough to routinely support them.
> >
> > I'll agree that speed is both program and
> > context dependent, and that one usually
> > only has control over a few of the factors.
>
> It's perfectly plausible to have control; It's a matter of managing your
> environment.
>

True.  But that depends on how big
your project is, and of what commercial
importance.  A single person working on
a solution to a local, if important, problem
can get away with a lot of mix and match.
A big project - say new avionics for an
F-18 fighter - has other constraints.  It all
has to work right, and a lot of people have
to be proficient with the tools used.

It's a management issue all right, and
it's easy to see that approaches often
do not scale well.

//endjob

   -- Bob


>
> > On Mon, Feb 1, 2010 at 10:36 AM, Michael Mol <mikemol at gmail.com
> > <mailto:mikemol at gmail.com>> wrote:
> >
> >     On Mon, Feb 1, 2010 at 10:01 AM, Bob Kline <bob.kline at gmail.com
> >     <mailto:bob.kline at gmail.com>> wrote:
> >      > Uhmmm,  isn't execution speed and
> >      > coding speed the usual tradeoff with
> >      > high level languages?  A shell script
> >      > can get small things done in a hurry.
> >      > No one expects it to execute fast.  Or
> >      > should anyway.
> >
> >     True, to an extent, but some will perform better than others when
> >     given the exact same instructions. Keep in mind that different
> >     languages have different ways to let you reach the same end
> >     efficiently. Taking advantage of language idioms will go a long
> >     way in making a "slow" language work better.
> >
> >     I won't pretend to be able to back up the point with specific
> >     examples, though. I'd have to be an expert in each language.
> >
> >      > Isn't it usually the case that one
> >      > needs a compiled version of high
> >      > level code before the speed improves?
> >
> >     No. Speed improvements usually occur in a few stages (in no
> >     particular order):
> >
> >     * Throw more/better hardware at the problem.
> >     * Take advantage of caching in more places
> >     * Refactor the code to meet different design requirements.
> >     * Refactor the code because you learned how to better write to the
> >     language
> >     * Change your execution environment.
> >
> >     There's a *lot* that can be gained from those first four, and
> >     by the time you can refactor to write to the language, your market
> >     value probably doubled compared to when you first started writing in
> >     that language professionally.
> >
> >      > As in an order of magnitude and more?
> >      > High level languages keep people
> >      > from having to learn things like assembly
> >      > language and "C,"  reduce expensive
> >      > labor costs, and exploit cheaper, faster
> >      > hardware, but I'd have thought that it was
> >      > clear what the price of them is.
> >
> >     Writing in a more expressive ("higher level") language improves labor
> >     costs because it allows you to decrease your iteration time in
> >     development.
> >
> >      > They are relatively slow. You never get
> >      > it all.
> >
> >     You never get it all with any language. It's a matter of looking at
> >     the task at hand, and choosing the right tool for the job. Would you
> >     write a wiki in C, much less assembler? I wouldn't even *try* it in
> >     C++ (the language I'm most proficient in) until I had a few more
> years
> >     of professional development under my belt, and I'd probably be smart
> >     enough to know better by then.
> >
> >     --
> >     :wq