Tuesday, April 21, 2015

The Need for Long-run Future Planning

There are many problems looming in the future that seem not to be considered by the people who run things. Why is that?

I saw a TV ad today of a car-building assembly line manned by a bunch of robots. I got the impression that the makers of the ad thought that we, the audience, would be impressed or filled with admiration for the efficiency of the robots. Well, I have news for them. The reaction it provoked in me was to wonder how the people who used to do that work are supposed to make a living now. Should they cash in their pensions to go back to school at the age of fifty or fifty-five and learn computer programming? A programmer I know got laid off when her company shipped her job to Asia.

The owners of the car business think their job is to make the biggest profit they can for their shareholders (themselves included). OK, that is the theory of capitalism: competition inspires people to be continuously as efficient as possible, and anyone who loses his job in the process should either find a new one or create a new business and become a capitalist himself. Fine; but if a million people are thrown out of work, does anyone believe there would be a need for a million new businesses, even imagining for a moment that every one of those laid-off workers could be that creative?

The attitude comes, I believe, from a perfectly natural way of thinking. Back in prehistory, when the population of the earth was small and scattered, people dropped their waste outside their usual precincts and the environment disposed of it. They did not have to deal with plastics, which are unnatural in the sense that some of them never deteriorate.

So humans can no longer trust the environment to return their waste to raw material, but no one has come up with a definitive solution to that problem. Instead, different “interest groups” fight over who should take responsibility for getting rid of such waste and who should pay for it. As the debate goes on, the problems only grow worse.

So now we see two growing problems—the accumulation of undigested waste around the world, and the accumulation of unemployment that will never be reduced as robots replace humans in ever greater numbers. There was an article in Sunday’s New York Times, “The Machines Are Coming,” that described how the old solutions—retraining laid-off workers in new, more complex skills—are no longer working. I believe we need something much more fundamental: a revolutionary attitude that getting more jobs is not the answer. The world simply does not need all the available workers, so why not agree that the machines should provide all the necessities of living, while humans consume them freely and spend their (our) time on creative activities—art, science, song and dance? I read somewhere that the Polynesians lived like that: it took only a couple of hours a day to supply their needs, which were furnished by their lush natural environment, and they spent most of their time adventuring or creating.

So why not?

Of course, there is a third issue—the one that might make the others irrelevant: global warming. If we are indeed approaching a deadline beyond which the earth will become uninhabitable, it is fair to ask the legislators who are blocking any action on it: WHY?

1 comment:

  1. There are several interesting ideas here. I'll address one, and it will also touch on elements of your previous post on robots.

    It seems like we need to decide some things first. What are our fears around robots? What are robots, anyway? It seems to me that a robot can be defined, at its simplest, as a device created to perform repetitive tasks with greater quality and reliability than a human. Well, as a capitalist society, this seems pretty good, right? Businesses save labor costs and have higher-quality goods. Workers lose their jobs and move on to some other endeavor better suited to human intelligence. Can anyone really argue that a machine that replaces a man applying rivets on an assembly line is a bad thing? I might go so far as to say that tasks like this are not worthy of the attention of such a magnificent machine as a human, other than figuring out how to get them done in the most efficient way.

    Now, a robot can also refer to something much more complex. My guess, and I am predated by many science fiction writers, is that robots will eventually become sentient beings, probably more advanced than humans! This is where it gets scary. As robots develop, we will use them for more and more complex tasks; I think the situation might come to resemble slavery. At some point they will become self-sustaining and self-maintaining. This will be followed by self-designing. I wonder if, at the point that they are designing and improving themselves, we can start to think of them as evolving. Here's where it gets really interesting. At some point, these beings will eclipse their human "creators" in every aspect. They will evolve at a much faster rate than the physical limitations of our species will allow. They will be stronger and faster, learn unlimited amounts, and live longer. In fact, they could conceivably be immortal.

    So, uh, where does that leave us? Should we build in control systems that limit them and keep us on top of the food chain? The visionary writer Isaac Asimov thought about this and proposed "The Three Laws of Robotics." They are, as taken from Wikipedia (link below):

    http://en.wikipedia.org/wiki/Three_Laws_of_Robotics

    1.) A robot may not injure a human being or, through inaction, allow a human being to come to harm.

    2.) A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.

    3.) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

    Or do we allow their development and watch with passive fascination as they decide it's no longer fun to cook our meals and drive our kids to school? This is really coming. This is what scares me!
