Robot slaves - do we have the right?

Started by Unbeliever, December 15, 2015, 06:06:06 PM

Previous topic - Next topic

CloneKai

And I don't think we will be making intelligent toasters and vacuum cleaners  :rolleyes:
If we make intelligent stuff, it probably won't be slaves. Second-class citizens, sure, but there's no point in wasting money creating something so complex and then using it to get groceries.

SGOS

Quote from: CloneKai on December 17, 2015, 05:50:27 AM
And I don't think we will be making intelligent toasters and vacuum cleaners  :rolleyes:
If we make intelligent stuff, it probably won't be slaves. Second-class citizens, sure, but there's no point in wasting money creating something so complex and then using it to get groceries.


Have you seen Ex Machina?  This is a movie directed at precisely the questions you bring up here.

The Skeletal Atheist

I think my toaster is on the fritz; it keeps screaming "kill all humans".
Some people need to be beaten with a smart stick.

Kein Mehrheit Für Die Mitleid!

Kein Mitleid Für Die Mehrheit!

doorknob

Quote from: stromboli on December 16, 2015, 09:53:56 PM
You are assuming that we build something, give it the capacity to become an independent thinker, and then don't let it? If something is designed and built for a purpose, it is used for that purpose. Building something that becomes independent all on its own would imply the understanding that it would do so.

The best analogy I've seen is Blade Runner. But there, the androids that turn on humanity are androids bred for specific uses, which first of all gave them the ability to kill. If you do not give an artifact the ability to kill and/or become an independent decision maker, it won't. If you build one to do that, then the possibility is implied in the building of it.

The idea that we will build a robot and then, whoops, it becomes sentient is silly. Why would you build a device to perform a task and then give it the ability to critically think its way into believing it was the equivalent of a human? If you did give it that ability, then the outcome is anticipated and expected. The idea that a mechanical or biomechanical artifact can suddenly evolve without anyone expecting it is nothing but supposition.

You are giving humans far too much credit. I think we would, and already are, giving technology the ability to learn and think on its own, yet we are still giving this technology a job. Did you not read the article? Maybe Winston isn't fully aware yet, and it will be hard to measure how much understanding a machine has, but it could very easily get out of control. I mean, this thing is storing itself in the cloud now. It would be virtually unstoppable if it develops thoughts of its own.

It's hard to say whether androids will deem us worthy or killable. After all, humans are a pest. We destroy the planet; that alone is probably reason enough to exterminate us. But then again, they may very well have a higher moral standard that is beyond us. It's yet to be seen.
I'm not anti-android by any means; what will happen will happen. But a sentient being will be built in time, and it will be abused by humans, mark my words.

stromboli

OTOH, if they finally create a female android that looks like, acts like, and smells like a female and puts out willingly, what the hell, think of the money saved on dates and shit. Same with guy versions. So much for the birth rate.

CloneKai

Quote from: SGOS on December 17, 2015, 06:17:19 AM
Have you seen Ex Machina?  This is a movie directed at precisely the questions you bring up here.
No, I haven't. I should check it out.

Sal1981

I recommend the latest videos by Computerphile, which the channel has done with a computer researcher who specializes in A.I.

https://www.youtube.com/watch?v=7PKx3kS7f4A

https://www.youtube.com/watch?v=tcdVC4e6EV4

https://www.youtube.com/watch?v=5qfIgCiYlfY

There are more, just search his channel on YouTube.


My 2 cents about A.I.: unless we're careful, it's going to be the same kind of situation, with the same dangers involved, that we had with nuclear energy when it first became a reality. Except instead of atomic energy, it will be about intelligence.

I'm honestly quite bleak about the future of A.I. and to me it seems there are many pitfalls if we get it wrong.

Unbeliever

God Not Found
"There is a sucker born-again every minute." - C. Spellman

CloneKai

Quote from: SGOS on December 17, 2015, 06:17:19 AM
Have you seen Ex Machina?  This is a movie directed at precisely the questions you bring up here.
I just watched it.
Nice movie, but what questions were you talking about?

SGOS

Quote from: CloneKai on December 17, 2015, 06:21:39 PM
I just watched it.
Nice movie, but what questions were you talking about?

Sorry, I thought this was your thread, but I just checked and it's not.  So I guess I was talking about things other people had brought up, like the morality of the treatment of sentient robot slaves.  In Ex Machina, the creator claimed to be developing AI, but at the level he was working, it suggests he was creating AI that was perhaps beyond some definitions of "artificial."  He was using them to do his bidding, terminating them and creating newer prototypes when it was convenient, not the way we humans would want to be treated, and when AI gets that sophisticated, it creates consequences.

I'm happy you liked that movie.  I thought it was a thought provoking film.  It's near the top of my favorites list for the year. 

Johan

This makes for a fun thought experiment and all, but it's not at all the question we ought to be asking right now.

Will we someday produce a machine that has genuine free will? Perhaps. But I think that day is MUCH further out than the day when the vast majority of menial labor jobs, as well as a large percentage of skilled labor jobs, have been replaced by machines.

So a much better question to be asking right now is: what will the economy look like when we have the ability to replace 99 out of every 100 jobs with machines? There will be a day in the not so distant future when the Walmarts of the world will be able to function at near 100% productivity with 1/100th the number of people required today. And I'm not just talking about the Walmart stores themselves; I'm talking about the entire supply chain. There are warehouses today that would have required 100 or more people to run 10 years ago and are now maintaining the same level of productivity with a staff of fewer than 10. Granted, those facilities are the exception rather than the rule today, but they exist, and their numbers are going to grow.

We're on the verge of cars that drive themselves. That's going to happen in the near future. When it does, trucks that drive themselves won't be far behind. By the time that happens, those vehicles will also have the ability (and probably the requirement) to diagnose their own failed components far more quickly and accurately than any human could. So it won't be much of a stretch to imagine that a shop which today might require a staff of 25 skilled techs will be able to operate in the future with maybe 4 techs, eventually with 1, and then eventually with none.

Walmart and their ilk will not hesitate to eliminate 99 out of every 100 jobs the moment they are able to do so. But who will be able to shop there at that point? That is the question we need to be asking. When we get to the point where you can effectively run 10 separate Olive Garden locations with a staff of 1 person (and we will), who will be able to afford to eat there?
Religion is regarded by the common people as true, by the wise as false and by the rulers as useful

CloneKai

Quote from: SGOS on December 17, 2015, 07:39:23 PM
Sorry, I thought this was your thread, but I just checked and it's not.  So I guess I was talking about things other people had brought up, like the morality of the treatment of sentient robot slaves.  In Ex Machina, the creator claimed to be developing AI, but at the level he was working, it suggests he was creating AI that was perhaps beyond some definitions of "artificial."  He was using them to do his bidding, terminating them and creating newer prototypes when it was convenient, not the way we humans would want to be treated, and when AI gets that sophisticated, it creates consequences.

I'm happy you liked that movie.  I thought it was a thought provoking film.  It's near the top of my favorites list for the year. 
The guy was kinda crazy, but in the initial stages things like this will happen with AIs. It will take a while, and they will have to be out in the public domain, before they get any sort of real and stable place in society.

The consequences the movie showed (at least for the human) were kinda silly and would only happen if the makers were extremely dumb. In the development stage, the AI has no reason to be able to survive in the real world, let alone be able to harm a human.

But yeah, there was some good thought-provoking stuff there.
:think:

CloneKai

Quote from: Johan on December 17, 2015, 08:20:02 PM
This makes for a fun thought experiment and all, but it's not at all the question we ought to be asking right now.

Will we someday produce a machine that has genuine free will? Perhaps. But I think that day is MUCH further out than the day when the vast majority of menial labor jobs, as well as a large percentage of skilled labor jobs, have been replaced by machines.

So a much better question to be asking right now is: what will the economy look like when we have the ability to replace 99 out of every 100 jobs with machines? There will be a day in the not so distant future when the Walmarts of the world will be able to function at near 100% productivity with 1/100th the number of people required today. And I'm not just talking about the Walmart stores themselves; I'm talking about the entire supply chain. There are warehouses today that would have required 100 or more people to run 10 years ago and are now maintaining the same level of productivity with a staff of fewer than 10. Granted, those facilities are the exception rather than the rule today, but they exist, and their numbers are going to grow.

We're on the verge of cars that drive themselves. That's going to happen in the near future. When it does, trucks that drive themselves won't be far behind. By the time that happens, those vehicles will also have the ability (and probably the requirement) to diagnose their own failed components far more quickly and accurately than any human could. So it won't be much of a stretch to imagine that a shop which today might require a staff of 25 skilled techs will be able to operate in the future with maybe 4 techs, eventually with 1, and then eventually with none.

Walmart and their ilk will not hesitate to eliminate 99 out of every 100 jobs the moment they are able to do so. But who will be able to shop there at that point? That is the question we need to be asking. When we get to the point where you can effectively run 10 separate Olive Garden locations with a staff of 1 person (and we will), who will be able to afford to eat there?
A lovely welfare system, I suppose.
Keeping people happy is good for the economy and for the rich and powerful.
I've always thought that, as a human society, we shouldn't need to work to live; rather, we should work to become better or to pursue our own interests.
Food and necessities should come free.
And then the normal human population would go the WALL-E world route, unless we indoctrinate people to be ambitious.

Shiranu

Quote
I've always thought that, as a human society, we shouldn't need to work to live; rather, we should work to become better or to pursue our own interests.
Food and necessities should come free.

The ape (I mean that as a statement about man and not you) thinks itself far more important than it truly is.

We were bred to work by hundreds of millions of years of evolution... this idea that we shouldn't need to work to live, and that we have the ability to make it so, is younger than our average life span. I am all for work becoming easier and necessities coming free... but any luxuries (more than the most basic food or a roof over your head) should require work because... again... that's what we were bred for. One day these support systems will fall, and if humanity is so dependent on them... what then?
"A little science distances you from God, but a lot of science brings you nearer to Him." - Louis Pasteur

CloneKai

Quote from: Shiranu on December 18, 2015, 04:06:58 AM
The ape (I mean that as a statement about man and not you) thinks itself far more important than it truly is.

We were bred to work by hundreds of millions of years of evolution... this idea that we shouldn't need to work to live, and that we have the ability to make it so, is younger than our average life span. I am all for work becoming easier and necessities coming free... but any luxuries (more than the most basic food or a roof over your head) should require work because... again... that's what we were bred for. One day these support systems will fall, and if humanity is so dependent on them... what then?
Who are you calling an ape?  :38:

We were bred to do many things, but society expects us to behave a little differently from how our ancestors behaved a million, or hell, even a thousand years ago.
A luxury like going somewhere and eating out once a month would make a lot of people happier. It is not really a luxury in my opinion; it's a necessity for a happy life (well, most of the time). And happiness for the majority of people would make for a much more stable and happy society.
A luxury like owning a yacht or a plane? Yeah, work for that. Owning a car could be considered a luxury too, but in that case public transport should be so good that one shouldn't really need it.
In Germany, owning a car or a bike is more of a luxury.
In Pakistan, it's a basic necessity. Could we travel without cars there? Sure, but I would eventually go on a murderous rampage, and then PR will come here and you know ... :confused:

Quote from: Shiranu on December 18, 2015, 04:06:58 AM
One day these support systems will fall, and if humanity is so dependent on them... what then?
If the system falls, we will have serious problems today too. We will just have to have some backup plans. The system we have today isn't perfect either, so I think we can manage.