Software Nerd

Saturday, February 26, 2005

Computers in Elementary School

Should elementary kids learn computers? Some say that computers cannot help and might even be distracting, at best. Others say that in this age of computers we need to start our kids off early.

It's fine for elementary kids to learn how to use a computer for something other than games, but it's far-fetched to suggest that it will help them use computers later. Using the typical tools -- word-processors & spreadsheets -- takes very little training.

On the other hand, I'm not a complete Luddite about this either. I think that some well-constructed teaching programs can be useful. They are particularly useful in drill. There are many things that a child learns by mental drill -- repetition.

Here's one example from the life of my 6-year-old. He's learning certain "math facts". Having passed the stage of adding numbers by counting, he is now being taught to memorize some. Presented with any pair of single-digit numbers, he should be able to recall their sum, difference and product, without having to mentally add, subtract or multiply them.

He was stuck on memorizing the products. His "7 times tables" were hard, and so were 8 and 12. Every day, I would quiz him on two or three pairs. In a spare moment, I might ask: "So, what is 7 times 8?" After about a week, I found that he was able to remember more. So, I wrote a small software program to:
  • display a "problem", like "7 x 8 = ??";
  • accept an answer; and,
  • say if it was right or wrong.

It also displayed a score at the top of the screen, "You got 5 right out of 6". I randomized the selection of numbers that would be presented. I also made it log each question and whether it was answered right -- that way I could review it and see if he was having trouble with certain specific pairs (why is "7 times 8" so hard?).
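The original program is long gone, but the behavior described above is easy to sketch. Here is a minimal version in Python; the function names, the number range, and the log file name are my own choices, not the original's:

```python
import random

def make_problem():
    """Pick a random pair from the times tables being drilled."""
    a = random.randint(2, 12)
    b = random.randint(2, 12)
    return a, b

def check_answer(a, b, answer):
    """Return True if the given answer is the correct product."""
    return answer == a * b

def drill(rounds=6, log_path="drill_log.txt"):
    """Run one drill session, keep a running score, and log each question."""
    score = 0
    with open(log_path, "a") as log:
        for _ in range(rounds):
            a, b = make_problem()
            answer = int(input(f"{a} x {b} = ?? "))
            correct = check_answer(a, b, answer)
            if correct:
                score += 1
                print("Right!")
            else:
                print(f"Wrong -- {a} x {b} = {a * b}")
            # Log the pair and the outcome, so trouble spots
            # (why is "7 times 8" so hard?) stand out on review.
            log.write(f"{a} x {b}\t{'right' if correct else 'wrong'}\n")
            print(f"You got {score} right out of {rounds}")
```

Calling `drill()` starts an interactive session; the log file is a plain text file you can eyeball afterwards.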

It worked! He has been able to drill at odd moments, and without my being there to help him. The drill has improved his retention of the "Math facts". He even finds the process mildly interesting.

So, yes, a computer can be a useful tool, if applied to the right type of educational problem. Not the computer as something to learn, but the computer as a tool to help teach something else.

Friday, February 25, 2005

Base 10: Is it all it's made out to be?

"Why don't you folks use Kilometres?" asks the European.
"In India, we've been using kilograms for decades", says another.
The advantages of metric systems are two-fold:
1) Easier to remember for folks who compute using a Base-10 number-system
2) Uniformity of terms, across incommensurables (e.g. "kilo-metre", "kilo-gram", "kilo-byte", "kilo-whatever")

So, the question is: Is Base-10 a good system?

Why are eggs still sold by "the dozen"? History.
Why were so many things sold by "the dozen"?
For one thing, it makes it easy to get "a quarter", "a third", or "half". If things were sold by "the tens", the green grocer could give you "a half", but couldn't give you "a third" or "a quarter" without cutting up some fruit and vegetables.
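The grocer's argument is just divisibility: 12 has more whole-number divisors than 10. A quick, purely illustrative check in Python:

```python
def divisors(n):
    """Return all whole-number divisors of n, in ascending order."""
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(10))  # [1, 2, 5, 10] -- halves and fifths only
print(divisors(12))  # [1, 2, 3, 4, 6, 12] -- halves, thirds, quarters, sixths
```

So a dozen splits cleanly into halves, thirds, and quarters, while ten only splits cleanly into halves and fifths.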

Here are two interesting links:

The first one is the British Weights and Measures Association explaining some advantages of older systems.

The next is an article that goes a bit deeper into the mathematics of usability and concludes that base 12 is a pretty good system.

Of course, having two systems can be worse than having one bad one. Remember the Mars spacecraft that crashed because two modules were using different systems and did not "know" it.

Now, let's make the calendar metric and give people born on Feb 29th an annual birthday please!

'De-skilled' Software Development

Computer science (CS) is all about logic and algorithms, right?

Then, why do software development managers want to remove the thinking from software development?

I understand where they're coming from:
1) Automation (fewer people making more stuff) brings margins .... PROFIT$.
2) Automation brings consistency of product... quality, quality, quality.
3) Managers hate being dependent on the "geeky software-nerd types".

All this is "great in theory". Happily (for me), it defies reality. Can we ever have software-development assembly lines in "software factories"? There are places that go by that name, but even the die-hards will admit it isn't like producing cars or burgers... where each one is like the other. (A different color, a different topping and you're done. Cars $40 an hour, burgers $7.50 an hour.)

The place that really is like an assembly line for software is the CD/DVD duplication facility. Each product sold is identical and manufacturing costs a pittance... but, that's not what the managers want to automatize.

Can you automatize the design of a car, a plane...any complex product? Software development is not just a production activity. It involves design and production.

A paradox: as more tools are made available to automate repetitive tasks, what remains to be done has an ever higher proportion of skill and knowledge. Automatizing some of software development makes it more difficult to automatize the rest. [You can automate parts of the draftsman's job... you're left with proportionately more engineering... and that's a good thing.]

Thursday, February 17, 2005

Skills without knowledge

Thus far, I have not ranted in this blog... and what good is a blog without a rant? So, here goes.

Many software practitioners have the skills to use today's tools, but do not have the knowledge that underlies those skills.

Someone who has spent 5 or 6 years in software may wonder if I'm simply a tired old curmudgeon. I know how you feel. When I started my career, armed with the silver bullet of my day, the old fogies exasperated me. It was up to me, I thought, to teach them modern software development paradigms (that's a word that has endured).

No, I do not "pooh! pooh!" newfangled ideas. I don't say: "I could do the same thing faster in COBOL". Yes, there are some young ones -- too few -- who understand how new technology is always evolving from old. But the kids who say "things are completely different now, with technology XYZ" are just as ill-informed as my own age-cohorts who claim "it's all old wine in new bottles".

So, for those who need a history lesson, here are some nuggets:

  • Java is great, but the idea of translating a high-level language into byte-code is an old one. Pascal had "P-code". And IBM's "Virtual Machines" are an old concept.

  • Object oriented programming is at least 20 years old.

  • Working around HTTP's "statelessness" is a retrograde step. This is what one did in IBM's CICS. We called it "pseudo-conversational" then.

  • Object-oriented databases and XML databases (sic) might store data in the way it's typically used, but Dr. Codd spent his life showing why this was a bad idea. He managed to take the industry away from hierarchical and network databases. Let's not go back.

  • Extreme programming ... that's a subject for a rant of its own...

  • Client-server user interfaces are much more user-friendly than web interfaces (did I hear someone say "drag and drop"?).

  • Those who do not learn from history are doomed to repeat its mistakes.

Sunday, February 13, 2005

Advances in Software

Hats off to hardware improvements -- spectacular developments in processor speed and memory. These have made today's software possible.

Software has had its share of brilliant people. In a previous post, I mentioned Donald Knuth, Edsger Dijkstra and E. F. Codd. They helped build the theory of software development. Others have contributed great tools.

Some have been revolutionary. Dan Bricklin's idea, the spreadsheet, required a leap of imagination: a fundamentally different way of thinking about the user interface.

Others, like word-processors all the way to ERP applications, have been evolutionary, with each successive "generation" adding more functions.

What would be the top 10 ideas in software over the years?

The History of Software

In the 20 years since I started writing programs, I've seen great changes in software. The capabilities of today's programming languages, databases and other tools let me do things I could never have dreamed of doing before. I remember a pre-Windows system where a young colleague developed something like the list-boxes of today. Only, there were two problems:
  • it was slow; and,
  • it required a lot of work.

We lost money on that project and the client was unhappy about the sluggish system. Finally we re-developed it entirely from scratch, without the bells and whistles: it had been an idea whose time had not yet come.

Today, we take something like a list box for granted. Like any old fogey, I sometimes worry that young programmers do not appreciate the history of software development. I see examples where people repeat the errors of history.

In the right context, it's fine to say "Wouldn't it be cool if we could invent a steam engine that runs on coal?" But if you do so without knowing that steam engines existed...

So many ideas have contributed to software development. A lot of these ideas were in the area of algorithms. Take the ordering of data as an example. Is there a best (i.e. fastest) way to sort data? How about searching for data? Donald Knuth is famous for his work on algorithms. Thanks to the late Edsger Dijkstra, programming without GOTO statements is something today's students of programming consider "obvious". Today's ubiquitous relational databases owe their origins to E. F. Codd (though he was disappointed with what others did to his creation).
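To make the searching question concrete, the textbook answer is binary search: on sorted data, each comparison halves the remaining range, so the work grows logarithmically rather than linearly. A standard sketch in Python (not tied to any particular author's formulation):

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent.

    A million sorted items need about 20 comparisons this way,
    instead of up to a million with a straight scan.
    """
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1   # target can only be in the upper half
        else:
            hi = mid - 1   # target can only be in the lower half
    return -1
```

Of course, this only works because the data is already sorted -- which is exactly why the "best way to sort" question matters.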

The University of Pittsburgh Hall of Fame is a good place to see a list of people who contributed to computers. Every serious student of software must know something about some of them. Understand what the "before" scenario was and what change they contributed.

It's been a fascinating journey, and it has a long way to go.

My Introduction to Software

Software development is my primary interest, my work, and my play.

In 1984, when I was still ignorant about software, someone told me that in the computer language called BASIC, one could write: A = A + 1

Sacrilege! How can something be A and non-A at the same time?

A few months later I took my first mandatory BASIC programming course and realized that "=" did not mean "is equal to" in the context A = A + 1. Not in the sense of an assertion. Rather, it means: "make A equal to A + 1". To put it more simply: "Add 1 to A".
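The same assignment behaves the same way in any modern language; a two-line illustration in Python:

```python
a = 5      # make a equal to 5
a = a + 1  # read the old value of a, add 1, store the result back in a
print(a)   # prints 6
```

The "=" reads right to left: compute the right-hand side first, then store it in the name on the left.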

Programming was like solving puzzles. "Write a program to print the first 100 prime numbers." Hmm! Interesting.

That's how I was hooked.