6.23.2005
Stumbled across a quote from Norbert Wiener (1964) in the essay "The ethics of autonomous learning systems" by A. F. Umar Khan:
"The gadget-minded people often have the illusion that a highly automatized world will make smaller claims on human ingenuity than does the present one and will take over from us our need for difficult thinking, as a Roman slave who was also a Greek philosopher might have done for his master. This is palpably false. A goal-seeking mechanism will not necessarily seek our goals unless we design it for that purpose, and in that designing we must foresee all steps of the process for which it was designed, instead of exercising a tentative foresight which goes up to a certain point, and can be continued from that point on as new difficulties arise. The penalties for errors of foresight, great as they are now, will be enormously increased as automation comes into its full use."
Note the implicit suggestion: if we don't design machines for our own goals, and instead let a machine construct and pursue its own, we are freed from the obligation of foresight.