Atheistforums.com

News & General Discussion => News Stories and Current Events => Topic started by: Hydra009 on December 30, 2016, 04:05:29 PM

Title: Driverless cars and morality
Post by: Hydra009 on December 30, 2016, 04:05:29 PM
(http://img.koreatimes.co.kr/upload/newsV2/images/160314_p02_ethical.jpg)

I found an interesting story with a needlessly provocative title, "Self-driving cars are already deciding who to kill" (http://www.sfgate.com/technology/businessinsider/article/Self-driving-cars-are-already-deciding-who-to-kill-10825168.php).

QuoteAutonomous vehicles are already making profound choices about whose lives matter, according to experts, so we might want to pay attention.

"Every time the car makes a complex maneuver, it is implicitly making trade-off in terms of risks to different parties," Iyad Rahwan, an MIT cognitive scientist, wrote in an email.

The most well-known issues in AV ethics are trolley problems - moral questions dating back to the era of trolleys that ask whose lives should be sacrificed in an unavoidable crash. For instance, if a person falls onto the road in front of a fast-moving AV, and the car can either swerve into a traffic barrier, potentially killing the passenger, or go straight, potentially killing the pedestrian, what should it do?
Sam Harris has been talking about just such a dilemma on his podcast.  When people were asked whether a driverless car should prefer to save its driver's life or a pedestrian's life if there were no other option, they generally said there should be no preference - that the car should value both lives equally and shouldn't favor one over the other.  But when asked which car they would rather buy - one more likely to save the driver or one more likely to save the pedestrian - they overwhelmingly went with the car that saves the driver.

It seems like the first question had a fairly definitive answer that was masked by a response bias.  Admitting one's true position on the first question (preferring the car to save the driver and kill the pedestrian instead) would've come across as selfish or uncaring, likely biasing respondents toward claiming a neutral position they don't actually hold.

The emerging technology of AI is going to be a fascinating issue because of how much it reveals about humanity.  Put a dozen ethicists in a room and they might debate ethical questions like the trolley problem endlessly.  But put a strong AI in charge of a city's traffic and these seemingly unsolvable moral quandaries get resolved surprisingly quickly.

Another issue would be the risks associated with an experimental new drug starting human trials.  People could die if the drug doesn't perform well or has unforeseen adverse reactions.  Yet people will definitely die from an untreated life-threatening condition.  Who has the moral solution to that conundrum?  Someday, an AI might be the one making that call.

And if we succeed at developing a strong AI that's also ethical, it might be better at handling these hard choices than us humans.  After all, a computer won't be able to appeal to God or tradition or intuition.  A robot judge can't dole out harsher sentences because it's cranky or upset or bigoted.  That's already a substantial improvement, imo.
Title: Re: Driverless cars and morality
Post by: Johan on December 30, 2016, 05:26:58 PM
I've seen this question coming up over the past year or so. IMO it's largely a pointless discussion, and most of the people putting effort into debating it need to find something more productive to focus their attention on. Because everyone debating this topic seems to completely forget or ignore one incredibly important fact: human beings absolutely, positively suck at driving. Period.

We're so goddamn worried about these rock-and-a-hard-place trolley scenarios that will almost never manifest themselves. Meanwhile, in the US alone, 100 people lost their lives in car accidents today and another 6,000+ were injured or disabled. And another 100 will die tomorrow. We're talking about a technology that could realistically turn those kinds of daily numbers into annual numbers, and you want to get all up in my grill about some bizarre situation where there's an old person standing in the other lane and a busload of orphans in front of you and a dynamite factory on the other side and the car has to decide which one gets hit? How about fuck you. Fuck you in the neck. Who fucking cares which one it's programmed to hit? If we're talking 100 people a year vs 100 people a day, program the car to kill them all in that scenario and we'd still be better off.

BTW Hydra, I wasn't saying fuck you in the neck to you. That was aimed at those who seem to get so uptight about how we absolutely must work these questions out.
Title: Re: Driverless cars and morality
Post by: Hydra009 on December 30, 2016, 05:56:11 PM
Quote from: Johan on December 30, 2016, 05:26:58 PM
I've seen this question coming up over the past year or so. IMO it's largely a pointless discussion, and most of the people putting effort into debating it need to find something more productive to focus their attention on. Because everyone debating this topic seems to completely forget or ignore one incredibly important fact: human beings absolutely, positively suck at driving. Period.

We're so goddamn worried about these rock-and-a-hard-place trolley scenarios that will almost never manifest themselves. Meanwhile, in the US alone, 100 people lost their lives in car accidents today and another 6,000+ were injured or disabled. And another 100 will die tomorrow. We're talking about a technology that could realistically turn those kinds of daily numbers into annual numbers, and you want to get all up in my grill about some bizarre situation where there's an old person standing in the other lane and a busload of orphans in front of you and a dynamite factory on the other side and the car has to decide which one gets hit? How about fuck you. Fuck you in the neck. Who fucking cares which one it's programmed to hit? If we're talking 100 people a year vs 100 people a day, program the car to kill them all in that scenario and we'd still be better off.
Yeah, that was one of the comments on reddit when this article was posted there.  Even if the AI killed everyone in every trolley problem it encountered, it'd still be a better outcome than the current situation of humans routinely plowing into each other, so all this worry about how driverless cars would operate in trolley problem situations is foolish.  I'm inclined to agree.

Still, it'd be fascinating to see how people's moral theories (and the moral intuitions behind them) stack up when it's time to actually implement them.  Would we prefer a car that saves its occupants at all costs?  How about a car that avoids bigger objects when possible, resulting in hitting smaller objects, possibly splatting a small animal to avoid hitting a truck?  These are questions for wise men with skinny arms.
Title: Re: Driverless cars and morality
Post by: Johan on December 30, 2016, 08:21:36 PM
You forgot pasty skin. Skinny arms and pasty skin.

It is a somewhat interesting topic to explore in terms of social norms and whatnot. But it's such a complex subject, and the answers involve technology that is so amazingly complex, that it's really far beyond what most laymen can realistically comprehend, I think.

I think one of the biggest factors that most everyone glosses over is the fact that, flawed though it is, the human brain has an incredible ability to store an image of a 'perfect object' and is then able to almost instantly identify an infinite number of different perfect and non-perfect versions of that 'perfect object'. In other words, we can look at a picture with a person standing next to a garbage can and nearly instantly and effortlessly deduce which is which, even though we've never before seen that particular person or that particular garbage can.

Computers look at that same image and see two objects, one (presumably) taller than the other. They must then compare each to millions of other stored images to try to figure out what exactly each object is.
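
Roughly, that brute-force "compare against stored examples" idea looks something like this (a toy sketch with made-up feature vectors; real AV perception uses trained neural networks rather than literal template lookup, but the "find the closest stored example" step is the idea above):

import numpy as np

# Toy "compare against stored examples" classifier (illustrative only).
# Assumes images have already been flattened into equal-length vectors.
stored_images = {
    "person":      np.array([0.9, 0.8, 0.7, 0.2]),   # made-up feature vectors
    "garbage_can": np.array([0.1, 0.2, 0.3, 0.9]),
}

def classify(image_vector):
    # Label the new image with whichever stored example it is closest to
    # (Euclidean distance). Trained systems learn features instead of
    # memorizing raw examples, but the matching idea is the same.
    best_label, best_dist = None, float("inf")
    for label, example in stored_images.items():
        dist = np.linalg.norm(image_vector - example)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

print(classify(np.array([0.85, 0.75, 0.65, 0.25])))  # -> "person"

With millions of stored examples and high-resolution images, that matching loop gets expensive fast, which is part of why this is a hard problem.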

So, for example, let's say that someone cleaned out the garage and placed an old department store mannequin out by the curb for garbage pickup. Let's say this mannequin has one leg and one arm missing. Now let's say you're driving your good old-fashioned '72 Dart down that street and a kid runs out in front of you. You, being human, would almost automatically know that swerving and possibly hitting that mannequin is a far better option than continuing on and possibly hitting the kid.

But a computer could potentially see that mannequin as a disabled person who is less able to get out of the way than the obviously able-bodied person running directly in front of you, because it's incredibly difficult for a computer to instantly tell the difference between a person and a mannequin.

But then again, if we're being realistic, we'd need to remember that the autonomous car will have radar and other sensors that let it 'see' that kid running toward the street long before the human inside the car was in a position to actually see the kid. Therefore the autonomous car would theoretically have begun applying the brakes before the kid was ever in view, thereby opting to hit neither rather than having to choose which is the better option to aim for.
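
A rough back-of-the-envelope version of that "brakes before the kid is even in view" point (illustrative numbers only; reaction times, deceleration, and sensor range vary a lot by vehicle and conditions):

# Stopping distance = distance covered while reacting + braking distance (v^2 / 2a).
# The reaction-time and deceleration values below are assumed, not measured.
def stopping_distance_m(speed_mps, reaction_s, decel_mps2):
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

speed = 13.4  # roughly 30 mph, in metres per second
human = stopping_distance_m(speed, reaction_s=1.5, decel_mps2=7.0)  # typical-ish human reaction
robot = stopping_distance_m(speed, reaction_s=0.2, decel_mps2=7.0)  # sensors react far sooner
print(f"human ~{human:.0f} m, automated ~{robot:.0f} m")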

And yeah, there will most definitely still be situations where the car isn't able to take early action to avoid danger and will then have to instantly decide which 'thing' it should steer into. And yeah, people will definitely get hit, injured or killed as a result. But again, when we get right down to it, I think we're talking about killing a small handful per year while at the same time saving literally thousands of others not only from death but also from any kind of injury. In the end the answer is still: I don't care which way the engineers make the car choose, just get it done ASAP.
Title: Re: Driverless cars and morality
Post by: Baruch on December 30, 2016, 08:52:53 PM
If human beings suck ... then just kill them.  Don't let them drive, don't let them vote either.

Y'all aren't talking realistically about driverless cars and morality.  Driverless cars aren't human.  Also, driverless cars, for insurance purposes, will not be owned by ordinary individuals.  They will mostly be government and corporate fleet vehicles (see the rush for driverless trucks).  Mexican drivers in Mexican trucks aren't cheap enough.  I can't wait for driverless Domino's pizza drones too.  Pizza Blitz over London!  Morality never, never applies to human organizations ... if anyone bothered to examine them.  Morality only applies to individual human beings, and is usually used to oppress and kill them (by the assholes who are defining what is or is not moral).
Title: Re: Driverless cars and morality
Post by: Hydra009 on December 30, 2016, 09:23:29 PM
Quote from: Baruch on December 30, 2016, 08:52:53 PMDriverless cars aren't human.
(https://media.giphy.com/media/Hq4DYXhDkEvHW/giphy.gif)
Title: Re: Driverless cars and morality
Post by: Hydra009 on December 30, 2016, 09:43:18 PM
Quote from: Johan on December 30, 2016, 08:21:36 PMI think one of the biggest factors that most everyone glosses over is the fact that, flawed though it is, the human brain has an incredible ability to store an image of a 'perfect object' and is then able to almost instantly identify an infinite number of different perfect and non-perfect versions of that 'perfect object'. In other words, we can look at a picture with a person standing next to a garbage can and nearly instantly and effortlessly deduce which is which, even though we've never before seen that particular person or that particular garbage can.

Computers look at that same image and see two objects, one (presumably) taller than the other. They must then compare each to millions of other stored images to try to figure out what exactly each object is.
Yeah, human pattern recognition is amazing.  A few years ago, I got involved with Galaxy Zoo, a crowdsourced project where John Q. Public identifies galaxy types from photographs taken by a robotic telescope - the draw for people is that they get to boldly see galaxies that have never been seen by humans before.  Apparently, humans were/are better at categorizing galaxies than computers.

But I've been hearing for years that computers are making strides in image recognition.  Given any rate of improvement, they'll eventually catch up.  It'll definitely be an interesting development when they do.
Title: Re: Driverless cars and morality
Post by: Johan on December 30, 2016, 10:33:15 PM
True enough. And I suppose it's also important to remember that these systems could be made to see or sense things that people cannot. For instance, a mannequin would look very different from an actual person on a thermal imaging camera. People and animals also have an electrical signature of sorts that a machine could theoretically be made to sense. And with enough sensors and enough processing power, it's reasonable to believe these machines could, in a fraction of a second, create an accurate map of every living thing within a 500' radius and also be able to predict within a second or two whether any of those living things have the potential to create a conflict.
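
A crude sketch of what that kind of "map everything nearby and flag potential conflicts" check might look like (hypothetical data and thresholds; real systems fuse radar/lidar/camera tracks and use far more sophisticated trajectory prediction):

from dataclasses import dataclass

@dataclass
class Track:
    label: str
    x: float   # metres ahead of the car
    y: float   # metres left (+) or right (-) of the lane centre
    vx: float  # metres/second along the road
    vy: float  # metres/second across the road

def conflicts(tracks, horizon_s=2.0, lane_half_width=1.8, step_s=0.1):
    # Linearly extrapolate each track and flag any that ends up inside the
    # lane within the horizon - a crude stand-in for real trajectory prediction.
    flagged = []
    for t in tracks:
        for i in range(1, int(horizon_s / step_s) + 1):
            dt = i * step_s
            x, y = t.x + t.vx * dt, t.y + t.vy * dt
            if 0.0 < x < 30.0 and abs(y) < lane_half_width:
                flagged.append((t.label, round(dt, 1)))
                break
    return flagged

kid = Track("pedestrian", x=20.0, y=6.0, vx=0.0, vy=-3.0)   # running toward the road
can = Track("garbage can", x=18.0, y=6.0, vx=0.0, vy=0.0)   # stationary
print(conflicts([kid, can]))  # -> [('pedestrian', 1.5)]

The point is just that once you have positions and velocities from sensors, "will anything cross my path in the next couple of seconds?" becomes a cheap calculation.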
Title: Re: Driverless cars and morality
Post by: Baruch on December 31, 2016, 07:25:16 AM
Tomorrowland fantasy ... ruled by neo-lib SJWs?

https://www.youtube.com/watch?v=lNzukD8pS_s

"The girl says ... I felt anything is possible" ... aka I thought with my emotions, not with my brain.  I watched it when it came out, in the theater.  Enjoyed it.  But Disneyland is for children.  The future is always a dystopia, which kind depends on which stupid ideology is in charge.   Folks simply don't get the irony of this movie ... the button she held was an advertising gimmick.  AI is an advertising gimmick.  It is just computers, being the stupid machines they always are.  They are not smart, their programmers and engineers are smart ... sometimes.  I could use Exchange email as a counter argument ;-)

I was a member of the AAAI in the 80s.  Laymen don't know how their car works.  Pattern recognition is a lot harder than y'all think.  Imitation Game was an entertainment, not how it was actually done ... the pre-computer didn't decode the German messages, it helped narrow the possibilities so that humans could do the actual decoding.  This is a basic pattern recognition test ... can you decode an Enigma message into readable German?

What will make autonomous vehicles work is their lack of autonomy.  You have to input the whole traffic pattern - streets, lights, all other vehicles, etc. - and operate the whole transportation grid as a machine.  But as such, you can't have pedestrians or human-driven vehicles on it, any more than you can have private vehicles on railroad tracks.  Railroad tracks are for trains, owned by corporations, not individuals.  Then you can basically run transportation the same way the Soviet Union was run.  There is a reason why Star Trek etc. looks like the Soviet Union won the Cold War.  Atheism + technology + socialism.
Title: Re: Driverless cars and morality
Post by: Baruch on December 31, 2016, 07:42:36 AM
Here is a sequence of seemingly random one- and two-digit numbers.  This is an encoding of a common English sentence, so you don't have to adjust for differences of language and culture.  It is encoded with an Enigma-derived system I developed as a lark two summers ago ... it is better than Enigma, and in fact is comparable to AES, the current standard for US classified work.  Please perform pattern recognition good enough to decode it, and do that without any human intervention; an algorithm (which comes from where?) is the only thing you can use.

14 32 17 16 22 1 11 1 19 25 7 9 16 27 7 16 36 18 22 35 8 7 32 3 13 29 1 10 2 24 31 6 28 23 9 26 14 18 31 33 21 28

There is a problem in number theory having to do with the nature of what "pseudo" means in "pseudo-random" numbers.  This is crucial for cryptography.  It is a fundamental problem unsolved by the greatest mathematicians the world's governments can harness.  Basically, unless you can solve that problem (and not all math problems have a solution ... aka the solution is the null set), you can't do true non-human-assisted pattern recognition.
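
For anyone who has never seen what a stepping substitution even looks like in code, here is a toy sketch (this is NOT the system used for the numbers above, and it is nowhere near AES; it's only an illustration of why a keyed, position-dependent substitution looks like noise without the key):

# Toy rotor-style cipher: the substitution shifts with the key AND the position,
# so the same plaintext letter maps to different numbers each time it appears.
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def encrypt(plaintext, key_offsets):
    numbers = []
    for i, ch in enumerate(plaintext):
        shift = key_offsets[i % len(key_offsets)] + i   # the "rotor" steps every letter
        numbers.append((ALPHABET.index(ch) + shift) % len(ALPHABET))
    return numbers

def decrypt(numbers, key_offsets):
    chars = []
    for i, n in enumerate(numbers):
        shift = key_offsets[i % len(key_offsets)] + i
        chars.append(ALPHABET[(n - shift) % len(ALPHABET)])
    return "".join(chars)

key = [3, 14, 7, 9]            # the secret; without it, frequency analysis gets you nowhere on a short text
ct = encrypt("attack at dawn", key)
print(ct)
print(decrypt(ct, key))        # -> "attack at dawn"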

One problem I have with AI advocates is that they are nerds in their mom's basement who think that human beings are computers ... or they hope we are, because unless they can build Stepford chicks, they aren't getting any dates ;-)  Human beings aren't electronic, and human thought isn't software.  Thinking this is both a category error and a profoundly anti-humanist POV.

See the post I added today to the Math/Computer section, regarding the fraud occurring in the quantum computing field.  This ties in, because quantum computing can ostensibly solve another fundamental math/cryptography problem ... factoring large numbers into their prime factors.  Yes, Cold Fusion will save us all, and give us the technological equivalent of the free lunch.
Title: Re: Driverless cars and morality
Post by: Baruch on December 31, 2016, 09:57:34 AM
Professional pattern recognition ...

Medieval Chinese: 起信論義疏
Google Translate: From the letter on the sparse
Actual Meaning: Commentary on the Awakening of Faith

Pattern recognition is a set of special techniques, each appropriate for a special problem (domain), and each of which doesn't work on the other domains.  There is no known general algorithm, not even a Turing Machine program ... which is the definition of what an algorithm is.  Therefore, it requires something non-algorithmic to solve problems that are beyond an algorithmically defined pseudo-random number (aka a code of a solution).  There are subsets of algorithms, all of which can be computed by a Turing Machine program, which define particular classes of pseudo-random numbers.  There is a way on paper, a Turing Machine + Oracle, that allows one in principle, but not in practice, to specify the outlines, but not the specifics, of a non-algorithmically generated number.

Professional cryptography uses this ... the key is generated by a random natural process, not by an algorithm.  Aka, the CIA, NSA etc. assume that natural processes are not the result of any digital quantum computer.  An analog quantum computer is a different matter.  Insofar as the universe is ruled by quantum mechanics, it is an analog quantum computer ... but that isn't equivalent to a digital one.  A digital computer can only produce a subset (an equivalence set technically, since an infinite number of reals are mapped to each integer) of the output of an analog computer.  But this is good enough for things like accounting.

Again, this ties to number theory ... the integers (which can be algorithmically defined by finite processes, in the sense of a Turing Machine) are a proper subset of the real numbers.  Integers can only be used to approximate real numbers, and that is not equivalence!  Approximation is an old engineering joke, involving the difference between a mathematician and an engineer, trapped in a room with a beautiful girl whom they can only approach under certain rules.  The mathematician defeats himself because he is limited by Zeno's Paradox, but the engineer knows he can get what he wants ... with a sufficiently good approximation.

Google Translate uses the most advanced language recognition available ... but it is brittle, as all AI programs are ... it works really well on toy sentences, but not on more obscure ones, because this particular Medieval Chinese was not in its training data set.  It is much more successful on more recent language quotes (even in Chinese) that are more likely to be similar to its training data set.  This is a real limitation, not imposed by the system that is being trained.  So the technical question is: is the behavior of a totally no-human-in-the-loop transportation system representable by an algorithmically defined pseudo-random number or not?  This can't be cogitated one way or another ... it can only be empirically demonstrated.

If, for example, you have no driver in the car, but it is receiving data from the human-driven cars all around ... then in fact the human traffic is providing indirect steering of the driverless car, as opposed to direct steering by a person in the car.  But that isn't driverless driving, just a more hidden version of driver driving.  I await the results of the empirical demonstration of bumper cars ... but not while I am in traffic, thanks.
Title: Re: Driverless cars and morality
Post by: Cavebear on January 01, 2017, 06:16:52 AM
Quote from: Baruch on December 31, 2016, 09:57:34 AM

Google Translate uses the most advanced language recognition available ... but it is brittle, as all AI programs are ... it works really well on toy sentences, but not on more obscure ones, because this particular Medieval Chinese was not in its training data set.  It is much more successful on more recent language quotes (even in Chinese) that are more likely to be similar to its training data set.  This is a real limitation, not imposed by the system that is being trained.
I use Google Translate to communicate with a friend in Brazilian Portuguese.  Google only has Portuguese Portuguese.  It makes for some odd errors sometimes.  And as someone with a few years of Latin, I understand some of the basics of all Romance languages.

My friend can understand most of what I say in Portuguese Portuguese, but sometimes there are some real surprises!
Title: Re: Driverless cars and morality
Post by: Baruch on January 01, 2017, 12:05:18 PM
Quote from: Cavebear on January 01, 2017, 06:16:52 AM
I use Google Translate to communicate with a friend in Brazilian Portuguese.  Google only has Portuguese Portuguese.  It makes for some odd errors sometimes.  And as someone with a few years of Latin, I understand some of the basics of all Romance languages.

My friend can understand most of what I say in Portuguese Portuguese, but sometimes there are some real surprises!

That is silly.  Most people who speak Portuguese are in Brazil.  So the training set should be texts from Brazil, not old Portugal.
Title: Re: Driverless cars and morality
Post by: AllPurposeAtheist on January 01, 2017, 12:51:06 PM
Already lots of people are complaining about the idea of autonomous vehicles, and not for the reasons you might expect. It has little to nothing to do with morality and everything to do with having to give up on old technology. Folks love the idea of being able to punch the accelerator to zoom past the old guy driving ever so slowly. Personally I prefer to drive the posted speed limit, but that never stops most of the traffic from zooming past or driving way too close to the rear bumper of the car in front of them. They're all in a race to get home to guzzle beer and watch meaningless crap on teeeeveeee..
Title: Re: Driverless cars and morality
Post by: Baruch on January 01, 2017, 01:16:21 PM
Quote from: AllPurposeAtheist on January 01, 2017, 12:51:06 PM
Already lots of people are complaining about the idea of autonomous vehicles, and not for the reasons you might expect. It has little to nothing to do with morality and everything to do with having to give up on old technology. Folks love the idea of being able to punch the accelerator to zoom past the old guy driving ever so slowly. Personally I prefer to drive the posted speed limit, but that never stops most of the traffic from zooming past or driving way too close to the rear bumper of the car in front of them. They're all in a race to get home to guzzle beer and watch meaningless crap on teeeeveeee..

Have you hated human drivers your whole life? ;-)  If doing technology X is profitable, it will be done, moral or not.
Title: Re: Driverless cars and morality
Post by: Jason78 on January 02, 2017, 09:33:16 AM
QuoteThe most well-known issues in AV ethics are trolley problems - moral questions dating back to the era of trolleys that ask whose lives should be sacrificed in an unavoidable crash. For instance, if a person falls onto the road in front of a fast-moving AV, and the car can either swerve into a traffic barrier, potentially killing the passenger, or go straight, potentially killing the pedestrian, what should it do?

That's a false dichotomy for a start.   The accepted response in that situation would be to perform an emergency stop.

As the pedestrian has suffered an accident and fallen into the road, the car driver (whether AI or human) has been absolved of moral responsibility.   Accidents happen, and in this case the accident is the result of human error.  No reasonable person would blame the driver if a person fell out into the road and the driver took the action expected of him.
Title: Re: Driverless cars and morality
Post by: Cavebear on March 26, 2018, 02:54:43 AM
Quote from: AllPurposeAtheist on January 01, 2017, 12:51:06 PM
Already lots of people are complaining about the idea of autonomous vehicles, and not for the reasons you might expect. It has little to nothing to do with morality and everything to do with having to give up on old technology. Folks love the idea of being able to punch the accelerator to zoom past the old guy driving ever so slowly. Personally I prefer to drive the posted speed limit, but that never stops most of the traffic from zooming past or driving way too close to the rear bumper of the car in front of them. They're all in a race to get home to guzzle beer and watch meaningless crap on teeeeveeee..

I like driving along safely too.  But I don't trust bot drivers.  Not yet...
Title: Re: Driverless cars and morality
Post by: Hydra009 on March 26, 2018, 12:14:50 PM
Quote from: Cavebear on March 26, 2018, 02:54:43 AM
I like driving along safely too.  But I don't trust bot drivers.  Not yet...
If you're even slightly less likely to get into an accident compared to human driving, it's a worthwhile choice.  Multiply that choice by millions day in and day out, and it's a huge safety improvement.
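
To put rough numbers on that scaling (purely hypothetical rates, just to show why a small per-trip edge adds up):

# All figures below are assumptions for illustration, not real crash statistics.
trips_per_day = 10_000_000           # assumed number of daily trips that switch to AVs
human_rate    = 1 / 500_000          # assumed crashes per trip with a human driver
av_rate       = human_rate * 0.95    # "even slightly less likely": assume a 5% per-trip improvement

crashes_avoided_per_year = trips_per_day * (human_rate - av_rate) * 365
print(round(crashes_avoided_per_year))   # -> 365 fewer crashes a year from just a 5% edge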
Title: Re: Driverless cars and morality
Post by: Hydra009 on March 26, 2018, 12:32:00 PM
In the wake of the infamous Uber driverless car accident in which a pedestrian died, this thread may have new relevance.

The Daily Mail has a lovely headline: "Inside Uber's driverless death traps".  Quality journalism as always.  There are a number of similar scare articles (fear sells).

I'd just like to say one thing about this: people die all the time from conventional car accidents and no one bats an eye.  You've all heard about the one driverless car fatality that day, but did you hear about the average of 102* fatalities happening every day in the US?

(* 2016 data.  Source: National Safety Council)
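
Scaling that per-day figure up (simple arithmetic on the number quoted above, nothing more):

daily_fatalities = 102          # average US motor-vehicle deaths per day (2016 figure cited above)
print(daily_fatalities * 365)   # -> 37230, i.e. tens of thousands a year vs. one headline AV fatality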
Title: Re: Driverless cars and morality
Post by: AllPurposeAtheist on March 26, 2018, 12:40:39 PM
One question that I haven't seen addressed is at what point do we as a society decide whether we're going to have driverless cars, and if so, how do we deal with the millions of people who will never want to give up driving themselves?
If, for example, it can be proven beyond any reasonable doubt that driverless cars are safer than trusting human judgment, just how do you convince the people who still want to drive well over the posted speed limits, take shortcuts through the grass, and so on, that public safety is more important than their wants and desires? Collectors of vintage automobiles will undoubtedly have a tizzy fit over not being allowed to drive their old heaps, not to mention the millions who think of their cars and trucks as sovereign states of their own..
There has to be a tipping point where the old cars that require human interaction get phased out and replaced by autonomous vehicles run and operated by either the state or, gawd forbid, some corporation with the power to arrest and detain people who still drive.. I can see it becoming another cold-dead-hands scenario, just like the gun issues.
Enter the new breed of politicians on the side of stupid.. as if that's never happened before..
Title: Re: Driverless cars and morality
Post by: trdsf on March 26, 2018, 12:54:03 PM
I forget which thread it was in, and I'm too lazy to look for it.  Automated, driverless cars make excellent sense when most other cars on the road are *also* driverless and will therefore react in predictable, programmatic ways.  I've worked with computers long enough to know when the machine is at fault, and when the machine has failed due to human error.

The victim in the pedestrian accident also appears to have been jaywalking, crossing in the middle of the street.  I haven't seen any information on how much time the car had to process and react to a person where a person should not have been -- but *if* she just popped out into the street, especially *if* she'd been previously hidden by an SUV or other large vehicle... she could have as easily been struck by a human driver.  I wouldn't hold a human driver morally culpable under those circumstances.  Of course, there are several ifs there, and I await further information, but I wouldn't be surprised if what I described is what happened.

It's my understanding that Uber vehicles are programmed to "know" where crosswalks are.  Had she crossed where she was supposed to, all other things being equal, she'd very probably be alive right now.

As a daily bicyclist, I *welcome* drivers on the road who aren't texting or talking on their phones or speeding or just not paying any fucking attention to the several tons of metal they're manipulating.  Automated vehicles would make my daily commute much safer, since the humans around here think a red light means "Oh, what the hell, two or three more", that a stop sign means to pull a third of the way out into the cross street before stopping (if at all) and that speed limits are *lower* limits.  I swear sometimes, there's a little flag over my head that says '50 POINTS!'  It's the only explanation I have for the monumentally shitty way drivers are around here.
Title: Re: Driverless cars and morality
Post by: Hydra009 on March 26, 2018, 12:59:26 PM
Quote from: AllPurposeAtheist on March 26, 2018, 12:40:39 PM
One question that I haven't seen addressed is at what point do we as a society decide whether we're going to have driverless cars, and if so, how do we deal with the millions of people who will never want to give up driving themselves?
Currently (and probably for years to come), both systems will exist together on the same streets.

Eventually though, human-driven cars will be phased out.  My educated guess is that price differences (especially insurance rates) will incentivize autonomous cars to the point that human-driven cars will scarcely exist, except perhaps as a luxury item on private roads.

Most people will voluntarily adjust to the change, but there will undoubtedly be diehards who refuse to change.  I dunno if there's much that can be done to avoid that, though I think the transition should be made as painless as possible.
Title: Re: Driverless cars and morality
Post by: trdsf on March 26, 2018, 01:05:19 PM
Quote from: AllPurposeAtheist on March 26, 2018, 12:40:39 PM
One question that I haven't seen addressed is at what point do we as a society decide whether we're going to have driverless cars, and if so, how do we deal with the millions of people who will never want to give up driving themselves?
If, for example, it can be proven beyond any reasonable doubt that driverless cars are safer than trusting human judgment, just how do you convince the people who still want to drive well over the posted speed limits, take shortcuts through the grass, and so on, that public safety is more important than their wants and desires? Collectors of vintage automobiles will undoubtedly have a tizzy fit over not being allowed to drive their old heaps, not to mention the millions who think of their cars and trucks as sovereign states of their own..
There has to be a tipping point where the old cars that require human interaction get phased out and replaced by autonomous vehicles run and operated by either the state or, gawd forbid, some corporation with the power to arrest and detain people who still drive.. I can see it becoming another cold-dead-hands scenario, just like the gun issues.
Enter the new breed of politicians on the side of stupid.. as if that's never happened before..
I'd say it'll take a combination of the following:
Title: Re: Driverless cars and morality
Post by: Baruch on March 26, 2018, 01:06:32 PM
Quote from: Hydra009 on March 26, 2018, 12:32:00 PM
In the wake of the infamous Uber driverless car accident in which a pedestrian died, this thread may have new relevance.

The Daily Mail has a lovely headline: "Inside Uber's driverless death traps".  Quality journalism as always.  There are a number of similar scare articles (fear sells).

I'd just like to say one thing about this: people die all the time from conventional car accidents and no one bats an eye.  You've all heard about the one driverless car fatality that day, but did you hear about the average of 102* fatalities happening every day in the US?

(* 2016 data.  Source: National Safety Council)

How about the Tesla autopilot accidents?

https://www.csmonitor.com/Business/In-Gear/2016/1014/How-safe-is-Tesla-Autopilot-A-look-at-the-statistics

Talks about the dark art of statistics, to show that their autopilot is safer than a human driver.  Which driver are they comparing to?

Title: Re: Driverless cars and morality
Post by: Baruch on March 26, 2018, 01:07:14 PM
Quote from: AllPurposeAtheist on March 26, 2018, 12:40:39 PM
One question that I haven't seen addressed is at what point do we as a society decide whether we're going to have driverless cars, and if so, how do we deal with the millions of people who will never want to give up driving themselves?
If, for example, it can be proven beyond any reasonable doubt that driverless cars are safer than trusting human judgment, just how do you convince the people who still want to drive well over the posted speed limits, take shortcuts through the grass, and so on, that public safety is more important than their wants and desires? Collectors of vintage automobiles will undoubtedly have a tizzy fit over not being allowed to drive their old heaps, not to mention the millions who think of their cars and trucks as sovereign states of their own..
There has to be a tipping point where the old cars that require human interaction get phased out and replaced by autonomous vehicles run and operated by either the state or, gawd forbid, some corporation with the power to arrest and detain people who still drive.. I can see it becoming another cold-dead-hands scenario, just like the gun issues.
Enter the new breed of politicians on the side of stupid.. as if that's never happened before..

Police State plus SJW ... a marriage made in Heaven.  Or let only Bernie voters drive ;-)
Title: Re: Driverless cars and morality
Post by: Cavebear on March 27, 2018, 02:13:41 AM
I have concerns about driverless cars.  I haven't been in an accident in my life but one, a dead car at a green light in the dark 40 years ago, and it was minor.  I am a dedicated, careful driver.  I watch all around me.  I drive the speed limit, and less in the rain.  I am annoyingly legal and "defensive".  I don't yet trust AIs, but am willing to see them develop.
Title: Re: Driverless cars and morality
Post by: Baruch on March 27, 2018, 06:04:56 AM
Quote from: Cavebear on March 27, 2018, 02:13:41 AM
I have concerns about driverless cars.  I haven't been in an accident in my life but one, a dead car at a green light in the dark 40 years ago, and it was minor.  I am a dedicated, careful driver.  I watch all around me.  I drive the speed limit, and less in the rain.  I am annoyingly legal and "defensive".  I don't yet trust AIs, but am willing to see them develop.

Very reasonable.  I gave a "like" to trdsf's recent post ... but not because I like the advocacy.  It is precisely the utopian market intervention I deplore.  Might as well have Castro in charge.  The open admission of "social engineering" intention is what I "liked".  People who want to "save all people from illness" or "save all people from vehicular mayhem" have the kind of megalomania I don't want in politics.  You can talk like that to the television newscast, but I will think you are crazy.  I admit I don't know how to "save all people from vehicular mayhem" ... maybe it is simply nature's way of getting rid of Darwinian losers.
Title: Re: Driverless cars and morality
Post by: aitm on March 27, 2018, 06:24:46 PM
Get me one with a switch to "on". Then when and if I leave the pub a little "later" than normal, I just turn on the auto and voila! I wonder if DUI charge works if the car is in auto?
Title: Re: Driverless cars and morality
Post by: trdsf on March 27, 2018, 07:39:50 PM
Quote from: aitm on March 27, 2018, 06:24:46 PM
Get me one with a switch to "on". Then when and if I leave the pub a little "later" than normal, I just turn on the auto and voila! I wonder if DUI charge works if the car is in auto?
I would say that if you're in a vehicle running on autopilot while you're incapacitated, that can't be a DUI because you're not actually driving.  I mean, you're essentially a passenger, and letting the computer run the car is vastly more responsible than trying to pilot it yourself while three sheets to the wind, yes?
Title: Re: Driverless cars and morality
Post by: Hydra009 on March 27, 2018, 08:14:04 PM
Quote from: Cavebear on March 27, 2018, 02:13:41 AM
I have concerns about driverless cars.  I haven't been in an accident in my life but one, a dead car at a green light in the dark 40 years ago, and it was minor.  I am a dedicated, careful driver.  I watch all around me.  I drive the speed limit, and less in the rain.  I am annoyingly legal and "defensive".  I don't yet trust AIs, but am willing to see them develop.
Ah, but why do you have to drive defensively?  Mass adoption of autonomous cars not only protects others from you, it protects you from others.

Ever see the cops in a high-speed chase through multiple intersections?  Decades from now, that's going to look insanely dangerous and more than a little silly.
Title: Re: Driverless cars and morality
Post by: Baruch on March 27, 2018, 09:30:00 PM
Quote from: Hydra009 on March 27, 2018, 08:14:04 PM
Ah, but why do you have to drive defensively?  Mass adoption of autonomous cars not only protects others from you, it protects you from others.

Ever see the cops in a high-speed chase through multiple intersections?  Decades from now, that's going to look insanely dangerous and more than a little silly.

Particularly when Emperor Palpatine can just shut off your ignition from your Facebook page.
Title: Re: Driverless cars and morality
Post by: aitm on March 29, 2018, 09:42:45 PM
Quote from: trdsf on March 27, 2018, 07:39:50 PM
I would say that if you're in a vehicle running on autopilot while you're incapacitated, that can't be a DUI because you're not actually driving.  I mean, you're essentially a passenger, and letting the computer run the car is vastly more responsible than trying to pilot it yourself while three sheets to the wind, yes?

that's my plan....hehe....Ha-Ha  LOLOL  LOLOL    LOLOOLLOL.....HAHAHAHAHA........er......never mind
Title: Re: Driverless cars and morality
Post by: Cavebear on March 29, 2018, 11:14:28 PM
Quote from: Hydra009 on March 27, 2018, 08:14:04 PM
Ah, but why do you have to drive defensively?  Mass adoption of autonomous cars not only protects others from you, it protects you from others.

Ever see the cops in a high-speed chase through multiple intersections?  Decades from now, that's going to look insanely dangerous and more than a little silly.

I understand that driverless cars, given a starting and ending point, could be good.  Overall I have control concerns, but I would be willing to try one out.  Like from home to the grocery store.  But could it get out of my garage without scraping the sides?  Is there a point where I turn it on and relinquish control?  Can I take over control in weird situations?  I do get into some.
Title: Re: Driverless cars and morality
Post by: Baruch on April 06, 2018, 06:38:44 PM
Quote from: Cavebear on March 29, 2018, 11:14:28 PM
I understand that driverless cars, given a starting and ending point, could be good.  Overall I have control concerns, but I would be willing to try one out.  Like from home to the grocery store.  But could it get out of my garage without scraping the sides?  Is there a point where I turn it on and relinquish control?  Can I take over control in weird situations?  I do get into some.

Tesla Autopilot is like that ... you can choose to control or not.  Like advanced cruise control.  Unfortunately the people in the recent Tesla crash made a poor choice of when to relinquish control.

https://www.wired.com/story/tesla-autopilot-self-driving-crash-california/
Title: Re: Driverless cars and morality
Post by: Cavebear on April 07, 2018, 03:30:55 AM
Quote from: aitm on March 29, 2018, 09:42:45 PM
that's my plan....hehe....Ha-Ha  LOLOL  LOLOL    LOLOOLLOL.....HAHAHAHAHA........er......never mind

I could never be charged with DUI?  Not that I have ever been or should have been, but when I get old and stupid, I can't?