A Windows XP help forum. PCbanter



Microsoft 'Confirms' Windows 7 New Monthly Charge



 
 
#166 - February 14th 19, 04:07 AM - posted to alt.comp.os.windows-10
Gene Wirchenko[_2_]

On Wed, 13 Feb 2019 11:46:06 -0700, Ken Blake
wrote:

[snip]

I've long been amused by statements like "you should drive slower on
wet roads."

People think that's the safe thing to do. But say exactly the same
thing in reverse, "you should drive faster on dry roads," and most
people would think it's the dangerous thing to do.


That is not the exact opposite. The exact opposite would be
something like saying that it is not as dangerous to drive faster on
dry roads.

Sincerely,

Gene Wirchenko
#167 - February 14th 19, 08:57 AM - posted to alt.comp.os.windows-10
Chris

Wolf K wrote:
On 2019-02-13 13:53, Chris wrote:
Wolf K wrote:
On 2019-02-13 05:42, mechanic wrote:
On Wed, 13 Feb 2019 09:21:53 +0000, Chris wrote:

In the UK, if you hit a pedestrian with your car you will be
charged with either "Dangerous driving" or "Driving without due
care and attention" and you have to make the case that it was
unavoidable.

Yes we have very much a blame culture in the UK, there's no such
thing as an 'accident'.

I don't see any mention of vehicle emergency braking systems on here
but they are available in many new cars.


Yes, I call that "creeping autonomy". Such driver-assist tech is IMO a
good idea.


Except when you start relying on it and it doesn't work. See the poor woman
in Phoenix.


IIRC, that wasn't a "driver assist" tech (such as lane change warnings,
and auto-braking if you get too close to the car ahead). The driver had
set the car to "auto pilot". IIRC, the car was being road tested. The
driver wasn't supposed to let the car do its thing. Then there was the
guy whose car drove under a transport, which decapitated him.


I agree, and that was my point. As soon as we start relying on the tech it has to
be bulletproof. You can't expect a human to take over with split-second
notice. The testing is flawed IMO. The tech needs to be a backup to the
human, not the other way around.

"Autonomous" is not here yet. Except of course if you put the car on
rails. Oh, yeah, we have that already. "A railroad is a horizontal
elevator", as someone pointed out decades ago. A lot of current subway
and LRT systems run autonomously 90% of the time or more. The driver's
there to handle the exceptions. Much cheaper system than autonomous cars.


Yep. Autonomous cars will only take over in well-controlled environments
like motorways, I think. In urban areas, especially outside of the US,
roads are too chaotic and densely populated. Mass transport systems are
better suited.



#168 - February 14th 19, 10:11 AM - posted to alt.windows7.general,alt.comp.os.windows-10
Eric Stevens

On Wed, 13 Feb 2019 17:51:28 -0700, Ken Blake
wrote:

On Thu, 14 Feb 2019 12:45:42 +1300, Eric Stevens
wrote:

On Wed, 13 Feb 2019 18:35:22 +0000, "J. P. Gilliver (John)"
wrote:

In message , Mark Lloyd
writes:
On 2/12/19 7:02 PM, Eric Stevens wrote:

[snip]

Person A: "It was sunny yesterday!"
nospam: "not last night, it wasn't!"



Bang on, Char.

Dammit!
I at first wrote that but then thought it was unnecessarily
provocative and deleted it.
(-:

I have deleted a post, but not before someone replied so my words
weren't really deleted.

Even if they hadn't, I doubt you have "deleted your post": unless the
server you are using honours delete requests _and_ processed it before
communicating with its peers, there is little you can do to delete a
post.


I deleted it before I posted.



OK, but in that case, don't say "I have deleted a post." That only
confuses people. If you didn't post it, it wasn't a post.


I didn't say "I have deleted a post". That was Mark Lloyd.
--

Regards,

Eric Stevens
#169 - February 14th 19, 10:13 AM - posted to alt.windows7.general,alt.comp.os.windows-10,alt.comp.freeware,alt.conspiracy
Eric Stevens

On Wed, 13 Feb 2019 18:13:20 -0600, Char Jackson
wrote:

On Wed, 13 Feb 2019 14:16:38 +1300, Eric Stevens
wrote:

On Tue, 12 Feb 2019 12:46:17 -0500, nospam
wrote:

In article ,
123456789 wrote:

Pedestrians do NOT have the right of way against a red light at a
traffic-light-controlled intersection!!! Where in hell did you get
that idea?

the motor vehicle code.

Not in my state (AZ/US):

(d) Unless otherwise directed by a pedestrian control signal as
provided in section 28-646, a pedestrian facing a steady red signal
alone shall not enter the roadway.

yes in your state:
https://www.azleg.gov/ars/28/00792.htm
28-792. Right-of-way at crosswalk
A. Except as provided in section 28-793, subsection B, if traffic
control signals are not in place or are not in operation, the driver
of a vehicle shall yield the right-of-way, slowing down or stopping
if need be in order to yield, to a pedestrian crossing the roadway
within a crosswalk when the pedestrian is on the half of the roadway
on which the vehicle is traveling or when the pedestrian is
approaching so closely from the opposite half of the roadway as to
be in danger. A pedestrian shall not suddenly leave any curb or other
place of safety and walk or run into the path of a vehicle that is so
close that it is impossible for the driver to yield.


"if traffic control signals are not in place or are not in operation".

See?

He is already determinedly trying to change the context of the
argument.


It's weird. nospam makes an argument and even provides multiple URLs
that he claims will support his argument. The weird thing is that none
of the URLs actually support his argument, so I think we're in for a
round of posts that redefine the initial claim so that the URLs can fit
the situation. AKA 'moving the goalposts'. I quickly lose interest,
which is what he hopes to achieve in the first place.


Several days ago I predicted that something like that would happen.
--

Regards,

Eric Stevens
#170 - February 14th 19, 10:16 AM - posted to alt.windows7.general,alt.comp.os.windows-10,alt.comp.freeware,alt.conspiracy
Eric Stevens

On Wed, 13 Feb 2019 21:18:08 -0500, nospam
wrote:

In article , Char Jackson
wrote:

Pedestrians do NOT have the right of way against a red light at a
traffic-light-controlled intersection!!! Where in hell did you get
that idea?

the motor vehicle code.

Not in my state (AZ/US):

(d) Unless otherwise directed by a pedestrian control signal as
provided in section 28-646, a pedestrian facing a steady red signal
alone shall not enter the roadway.

yes in your state:
https://www.azleg.gov/ars/28/00792.htm
28-792. Right-of-way at crosswalk
A. Except as provided in section 28-793, subsection B, if traffic
control signals are not in place or are not in operation, the driver
of a vehicle shall yield the right-of-way, slowing down or stopping
if need be in order to yield, to a pedestrian crossing the roadway
within a crosswalk when the pedestrian is on the half of the roadway
on which the vehicle is traveling or when the pedestrian is
approaching so closely from the opposite half of the roadway as to
be in danger. A pedestrian shall not suddenly leave any curb or other
place of safety and walk or run into the path of a vehicle that is so
close that it is impossible for the driver to yield.

"if traffic control signals are not in place or are not in operation".

See?

He is already determinedly trying to change the context of the
argument.


It's weird. nospam makes an argument and even provides multiple URLs
that he claims will support his argument. The weird thing is that none
of the URLs actually support his argument, so I think we're in for a
round of posts that redefine the initial claim so that the URLs can fit
the situation. AKA 'moving the goalposts'. I quickly lose interest,
which is what he hopes to achieve in the first place.


they do support it.


You missed a bit. I put it back.
--

Regards,

Eric Stevens
#171 - February 14th 19, 01:10 PM - posted to alt.comp.os.windows-10
Chris

nospam wrote:
In article , Chris
wrote:


Yes, I call that "creeping autonomy". Such driver-assist tech is IMO a
good idea.


Except when you start relying on it and it doesn't work. See the poor woman
in Phoenix.


that was human error, not a failure of an autonomous system.

the uber operator was not paying attention and didn't notice anything
wrong until it was too late.

the vehicle's anti-collision system *did* detect the pedestrian, except
that it had been disabled because uber was testing their own system,
which considered it to be a false positive.


Nope. The system failed. It misclassified the woman as a false positive (an
unimportant obstruction like a newspaper). Automated braking was disabled
as a requirement by the authorities to avoid the cars emergency-braking
every time they saw some rubbish on the road.
https://www.extremetech.com/extreme/...false-positive

the person in the vehicle was supposed to be monitoring what was going
on so that the system could learn, except she was watching tv instead.

had she been paying attention, there would not have been a crash.


As I said to Wolf, relying on a human to take control in 1.3 seconds is
never going to work.

also, the person who was hit stepped in front of a moving vehicle,
assuming it would stop, and in an area where visibility was limited.
that's not a good strategy.


Can't blame the victim. The car could have and should have avoided her.

autonomous vehicles do not need to be perfect. nothing can be.

they only need to be better than humans, which sadly, is not that hard
to do.


Agree that they don't need to be perfect, but the challenge is harder than
originally thought. Full autonomy is a long way off.

more than 1 million people die every year due to motor vehicle crashes,
with ~50 million more injured. that's 2 people killed every *minute*.

https://www.asirt.org/safe-travel/road-safety-facts/

nearly all of those could be avoided.
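As a rough check of the quoted rate (a sketch only, using the round 1 million figure from the post above; the commonly cited figure is somewhat higher), the arithmetic does come out to just under two deaths per minute:

# Back-of-the-envelope check of the "2 people killed every minute" claim,
# using the round 1 million deaths/year quoted above (an assumption here).
deaths_per_year = 1_000_000
minutes_per_year = 365 * 24 * 60           # 525,600 minutes in a year

print(deaths_per_year / minutes_per_year)  # ~1.9 deaths per minute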


Most can be saved with better road safety: tighten car/driver regulation
and improve road layouts. That's what's happened in the UK, and we have
amongst the lowest road deaths in the world - despite being a densely
populated country.


#172 - February 14th 19, 02:49 PM - posted to alt.comp.os.windows-10
nospam

In article , Chris
wrote:

nospam wrote:
In article , Chris
wrote:


Yes, I call that "creeping autonomy". Such driver-assist tech is IMO a
good idea.

Except when you start relying on it and it doesn't work. See the poor woman
in Phoenix.


that was human error, not a failure of an autonomous system.

the uber operator was not paying attention and didn't notice anything
wrong until it was too late.

the vehicle's anti-collision system *did* detect the pedestrian, except
that it had been disabled because uber was testing their own system,
which considered it to be a false positive.


Nope. The system failed. It misclassified the woman as a false positive (an
unimportant obstruction like a newspaper). Automated braking was disabled
as a requirement by the authorities to avoid the cars emergency-braking
every time they saw some rubbish on the road.


nope. what failed was the human operator, who was watching tv and not
monitoring the vehicle.

consider what would have happened if a teenager first learning how to
drive was in the same situation, with the driver's ed teacher in the
passenger seat, watching tv and not paying attention.

people with learner's permits crash and even kill people.

https://www.stgeorgeutah.com/news/ar...boy-dies-several-injured-after-learners-permit-driver-crashes-minivan/
https://www.dispatch.com/article/20130628/NEWS/306289669
https://www.newsday.com/long-island/...-crash-that-kills-4-had-learner-permit-1.4086600
https://www.newscentermaine.com/arti...-in-fatal-burnham-crash-had-only-a-learners-permit/97-580903130

https://www.extremetech.com/extreme/...bercar-saw-woman-called-it-a-false-positive

the person in the vehicle was supposed to be monitoring what was going
on so that the system could learn, except she was watching tv instead.

had she been paying attention, there would not have been a crash.


As i said to wolf, relying on a human to take control in 1.3 seconds is
never going to work.


exactly why autonomous vehicles are so important for reducing crashes,
injuries and fatalities, with reaction time measured in milliseconds,
able to monitor much more than any human ever could and unaffected by
fatigue, drugs or inattention.

also, the person who was hit stepped in front of a moving vehicle,
assuming it would stop, and in an area where visibility was limited.
that's not a good strategy.


Can't blame the victim. The car could have and should have avoided her.


of course the victim can be blamed. stepping in front of a moving
vehicle is a bad idea, regardless what type of driver it is.

autonomous vehicles do not need to be perfect. nothing can be.

they only need to be better than humans, which sadly, is not that hard
to do.


Agree that they don't need to be perfect, but the challenge is harder than
originally thought. Full autonomy is a long way off.


it's already happening in some areas.

what will be a long way off is when human driven vehicles are the rare
exception, but that doesn't actually matter.

more than 1 million people die every year due to motor vehicle crashes,
with ~50 million more injured. that's 2 people killed every *minute*.

https://www.asirt.org/safe-travel/road-safety-facts/

nearly all of those could be avoided.


Most can be saved with better road safety: tighten car/driver regulation
and improve road layouts. That's what's happened in the UK, and we have
amongst the lowest road deaths in the world - despite being a densely
populated country.


road safety doesn't fix human error, which is what causes crashes.

people do stupid **** all the time.
#173 - February 14th 19, 04:40 PM - posted to alt.comp.os.windows-10
nospam

In article , Wolf K
wrote:

road safety doesn't fix human error, which is what causes crashes.


Including design errors made by human engineers....


yep. everything can be traced back to human error.

Stats show that the crash rates have dropped over the last 50 years or
so in developed countries. Fatality rates have dropped even faster.


that's due to safer vehicles. human error rates haven't changed.

except that crashes have recently begun to increase, which has largely
been attributed to texting and driving, a major human error, one which
is entirely eliminated by autonomous vehicles.

https://www.npr.org/2018/08/22/64082...safer-traffic-fatalities-still-high
From 2014 to 2016, the number of people killed in motor vehicle
collisions jumped from a little over 35,000 to more than 40,000,
before leveling off at about 40,000 fatalities last year, a trend
that appears to be continuing.
#174 - February 14th 19, 05:15 PM - posted to alt.comp.os.windows-10
Char Jackson

On Thu, 14 Feb 2019 11:40:11 -0500, nospam
wrote:

except that crashes have recently begun to increase, which has largely
been attributed to texting and driving, a major human error, one which
is entirely eliminated by autonomous vehicles.

https://www.npr.org/2018/08/22/64082...safer-traffic-fatalities-still-high
From 2014 to 2016, the number of people killed in motor vehicle
collisions jumped from a little over 35,000 to more than 40,000,
before leveling off at about 40,000 fatalities last year, a trend
that appears to be continuing.


35,000-40,000 people killed every year in vehicle crashes, yet for every
*one* person killed in an autonomous vehicle crash there's a big uproar
about how the machines are dangerous and out to get us. As a nation, we
need to get real and push for faster development and adoption of
autonomous vehicles, rather than holding things back out of fear of the
unknown.

In a discussion with a friend just a few weeks ago, he said he'd rather
have 30,000 people killed by humans than 5,000 people killed by
computers. I think that's misguided and wrong. Of course, *I* could be
wrong.

Yes, there's the hacking aspect. One computer to drive the car until it
crashes (perhaps from a bad actor interfering), another computer to
say, "I see that the airbags have deployed. Help is on the way. Is
anyone still alive? I'm not detecting any life signs."

On a related note, one of the reasons I use Google to navigate when I'm
driving, instead of or in addition to the onboard nav system, is that
Google knows in real time about traffic problems ahead and can make
instant suggestions about bypassing those problems. In cases where it
can't do a bypass, it alerts me that there's an X minute delay ahead,
which is also useful. The next step, using autonomous vehicles, is to
take that real time navigation and hand it over to the computer that's
driving the car. That integration would be a welcome step forward.
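A minimal sketch of that hand-off, with made-up types and thresholds (nothing here is Google's or any carmaker's actual API): the navigation layer watches live traffic and only passes a new route to whatever is doing the driving when the saving is worth it; otherwise it just reports the delay.

from dataclasses import dataclass
from typing import List

@dataclass
class Route:
    waypoints: List[str]
    eta_minutes: float

def pick_route(current: Route, alternatives: List[Route],
               min_saving_minutes: float = 5.0) -> Route:
    """Reroute only when an alternative saves a meaningful amount of time;
    otherwise keep the current route and simply report the delay."""
    best = min(alternatives, default=current, key=lambda r: r.eta_minutes)
    if current.eta_minutes - best.eta_minutes >= min_saving_minutes:
        return best        # hand this route to the system doing the driving
    return current

current = Route(["I-94 E"], eta_minutes=42.0)
detour = Route(["US-12 E"], eta_minutes=33.0)
print(pick_route(current, [detour]).waypoints)   # ['US-12 E']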

#175 - February 14th 19, 05:47 PM - posted to alt.comp.os.windows-10
nospam

In article , Char Jackson
wrote:

except that crashes have recently begun to increase, which has largely
been attributed to texting and driving, a major human error, one which
is entirely eliminated by autonomous vehicles.

https://www.npr.org/2018/08/22/64082...safer-traffic-fatalities-still-high
From 2014 to 2016, the number of people killed in motor vehicle
collisions jumped from a little over 35,000 to more than 40,000,
before leveling off at about 40,000 fatalities last year, a trend
that appears to be continuing.


35,000-40,000 people killed every year in vehicle crashes, yet for every
*one* person killed in an autonomous vehicle crash there's a big uproar
about how the machines are dangerous and out to get us. As a nation, we
need to get real and push for faster development and adoption of
autonomous vehicles, rather than holding things back out of fear of the
unknown.


yep. people do not understand math and they also hate change.

people also give a free pass to those who drive on learner's permits,
who also crash and sometimes kill people.

In a discussion with a friend just a few weeks ago, he said he'd rather
have 30,000 people killed by humans than 5,000 people killed by
computers. I think that's misguided and wrong. Of course, *I* could be
wrong.


you're not wrong. 5000 deaths is much better than 30000 deaths for
society, but for the 5000 affected families, not so much.

Yes, there's the hacking aspect. One computer to drive the car until it
crashes, (perhaps from a bad actor interfering), another computer to
say, "I see that the airbags have deployed. Help is on the way. Is
anyone still alive? I'm not detecting any life signs."


there's a hacking aspect for human drivers. it's called carjacking.

auto-dialing emergency services already exists, and not just in cars
either.

On a related note, one of the reasons I use Google to navigate when I'm
driving, instead of or in addition to the onboard nav system, is that
Google knows in real time about traffic problems ahead and can make
instant suggestions about bypassing those problems. In cases where it
can't do a bypass, it alerts me that there's an X minute delay ahead,
which is also useful.


waze works exceptionally well for that.

The next step, using autonomous vehicles, is to
take that real time navigation and hand it over to the computer that's
driving the car. That integration would be a welcome step forward.


actually, that's best done with v2v, vehicle to vehicle communication,
which can send traffic information between vehicles, alerting nearby
drivers of any hazards, *without* needing a separate app or internet
connectivity.

https://en.wikipedia.org/wiki/Vehicular_communication_systems
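Purely to illustrate the idea, here is a toy hazard broadcast and the kind of filtering a receiving vehicle might do. The field names and thresholds are invented for this sketch; they are not taken from any real V2V standard.

from dataclasses import dataclass
import time

@dataclass
class HazardAlert:
    # Invented fields for illustration; not a real V2V message format.
    sender_id: str      # pseudonymous vehicle identifier
    latitude: float
    longitude: float
    hazard: str         # e.g. "stopped_vehicle", "icy_road", "pedestrian"
    timestamp: float    # UNIX time the hazard was observed

def should_warn(alert: HazardAlert, own_lat: float, own_lon: float,
                max_age_s: float = 10.0, max_offset_deg: float = 0.01) -> bool:
    """Warn only about fresh alerts that are roughly nearby.
    A crude lat/lon box keeps the sketch short; a real system would use
    proper distances and the vehicle's planned path."""
    fresh = (time.time() - alert.timestamp) <= max_age_s
    nearby = (abs(alert.latitude - own_lat) <= max_offset_deg and
              abs(alert.longitude - own_lon) <= max_offset_deg)
    return fresh and nearby

# A broadcast received from a vehicle a short distance ahead:
alert = HazardAlert("veh-4711", 51.5007, -0.1246, "stopped_vehicle", time.time())
print(should_warn(alert, own_lat=51.5015, own_lon=-0.1240))   # True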
#176 - February 14th 19, 10:03 PM - posted to alt.comp.os.windows-10
Chris

On 14/02/2019 14:49, nospam wrote:
In article , Chris
wrote:

nospam wrote:
In article , Chris
wrote:


Yes, I call that "creeping autonomy". Such driver-assist tech is IMO a
good idea.

Except when you start relying on it and it doesn't work. See the poor woman
in Phoenix.

that was human error, not a failure of an autonomous system.

the uber operator was not paying attention and didn't notice anything
wrong until it was too late.

the vehicle's anti-collision system *did* detect the pedestrian, except
that it had been disabled because uber was testing their own system,
which considered it to be a false positive.


Nope. The system failed. It misclassified the woman as a false positive (an
unimportant obstruction like a newspaper). Automated braking was disabled
as a requirement by the authorities to avoid the cars emergency-braking
every time they saw some rubbish on the road.

https://www.extremetech.com/extreme/...bercar-saw-woman-called-it-a-false-positive


nope. what failed was the human operator, who was watching tv and not
monitoring the vehicle.


Wrong. The AI should have recognised the bike and/or the person - it is
designed to do so. It did not, and misclassified them as a false
positive. This is the real risk of fully autonomous vehicles: they
don't fail safe.

consider what would have happened if a teenager


Irrelevant.



the person in the vehicle was supposed to be monitoring what was going
on so that the system could learn, except she was watching tv instead.

had she been paying attention, there would not have been a crash.


As i said to wolf, relying on a human to take control in 1.3 seconds is
never going to work.


exactly why autonomous vehicles are so important for reducing crashes,
injuries and fatalities, with reaction time measured in milliseconds,
able to monitor much more than any human ever could and unaffected by
fatigue, drugs or inattention.


*If* and only if they're capable of dealing with edge cases that
unassisted human drivers can. A human driver can pre-empt risky
situations by slowing down or preparing themselves to brake. They pick
up on lots of subtle visual cues. An autonomous car will drive at a
steady speed for as long as it doesn't notice any problems. No human
does that.

also, the person who was hit stepped in front of a moving vehicle,
assuming it would stop, and in an area where visibility was limited.
that's not a good strategy.


Can't blame the victim. The car could have and should have avoided her.


of course the victim can be blamed. stepping in front of a moving
vehicle is a bad idea, regardless what type of driver it is.


Funny, you argued exactly the opposite elsethread regarding drivers
"slamming on brakes".

autonomous vehicles do not need to be perfect. nothing can be.

they only need to be better than humans, which sadly, is not that hard
to do.


Agree that they don't need to be perfect, but the challenge is harder than
originally thought. Full autonomy is a long way off.


it's already happening in some areas.

what will be a long way off is when human driven vehicles are the rare
exception, but that doesn't actually matter.


And that's where the real challenge is: a mixed economy of autonomous
and human drivers. It is forcing two disparate systems to coexist.
Autonomous systems would work much better without chaotic humans around.

more than 1 million people die every year due to motor vehicle crashes,
with ~50 million more injured. that's 2 people killed every *minute*.

https://www.asirt.org/safe-travel/road-safety-facts/

nearly all of those could be avoided.


Most can be saved with better road safety: tighten car/driver regulation
and improve road layouts. That's what's happened in the UK, and we have
amongst the lowest road deaths in the world - despite being a densely
populated country.


road safety doesn't fix human error, which is what causes crashes.


Sure it does. It is the major reason why accident rates have fallen over
the years. Better signage, clear road markings, lighting at dangerous
junctions, speed bumps etc all contribute by giving the human less
chance to make errors or at least recover if they do.

people do stupid **** all the time.


Yep and they're not dying because of it as much as they used to.

#177 - February 14th 19, 10:59 PM - posted to alt.comp.os.windows-10
nospam

In article , Chris
wrote:

Yes, I call that "creeping autonomy". Such driver-assist tech is IMO a
good idea.

Except when you start relying on it and it doesn't work. See the poor
woman in Phoenix.

that was human error, not a failure of an autonomous system.

the uber operator was not paying attention and didn't notice anything
wrong until it was too late.

the vehicle's anti-collision system *did* detect the pedestrian, except
that it had been disabled because uber was testing their own system,
which considered it to be a false positive.

Nope. The system failed. It misclassified the woman as a false positive (an
unimportant obstruction like a newspaper). Automated braking was disabled
as a requirement by the authorities to avoid the cars emergency-braking
every time they saw some rubbish on the road.


https://www.extremetech.com/extreme/...h-ubercar-saw-woman-called-it-a-false-positive


nope. what failed was the human operator, who was watching tv and not
monitoring the vehicle.


Wrong. The AI should have recognised the bike and/or the person - it is
designed to do so. It did not and misclassified them as a
false-positive.


in other words, it *did* recognize it.

human drivers misclassify stuff all the time - far more often than this
one single occurrence.

autonomous vehicles are learning how to drive. they will make mistakes,
just like people do, and they will learn from them, just like some
people do.

This is the real risk of fully autonomous vehicles: they don't fail safe.


people don't fail safe at all. they guess and often guess wrong. worse,
they lack training for the edge cases and often panic, making the
situation worse. driver's ed does not teach edge cases.

autonomous vehicles only need to do better than humans. they do *not*
need to avoid *all* possible situations, nor can they.

consider what would have happened if a teenager


Irrelevant.


it's not irrelevant.

autonomous vehicles are learning how to drive, just like teenagers are.

if we accept teens with zero experience on public roads, then we must
also accept autonomous vehicles on public roads, which have *far* more
hours of experience in their algorithms and on test tracks.

people with learner's permits are dangerous:
https://www.centralmoinfo.com/2018/0...lled-in-accident-involving-learners-permit-driver/
https://www.stltoday.com/news/local/...-driver-with-learner-s-permit-killed-in-jersey-county/article_6da7412b-ea40-57fe-9842-7b97e17720f6.html
https://www.denverpost.com/2016/08/2...l-drag-racing-crash-only-had-learners-permit/
https://www.nhregister.com/news/arti...LD-Driver-in-crash-that-killed-11619669.php

the person in the vehicle was supposed to be monitoring what was going
on so that the system could learn, except she was watching tv instead.

had she been paying attention, there would not have been a crash.

As i said to wolf, relying on a human to take control in 1.3 seconds is
never going to work.


exactly why autonomous vehicles are so important for reducing crashes,
injuries and fatalities, with reaction time measured in milliseconds,
able to monitor much more than any human ever could and unaffected by
fatigue, drugs or inattention.


*If* and only if they're capable of dealing with edge cases that
unassisted human drivers can.


human drivers are *horrible* at edge cases. they don't know what to do,
often panic and almost always do the wrong thing, making it worse.

driver's ed only teaches the basics, not when **** happens.

not only that, but once humans get a license, it's forever. there is no
retesting. just go in for an eye exam and pay the fee.

human drivers should have mandatory testing every 5-10 years and if
they fail, their license is revoked. that won't ever happen, so we get
to deal with incompetent drivers and as a result, dead people.

A human driver can pre-empt risky
situations by slowing down or preparing themselves to brake. They pick
up on lots of subtle visual cues.


no they definitely don't.

autonomous vehicles can 'see' far more data than humans ever possibly
could.

humans have tunnel vision. they rarely know what's in their blind spots
or behind, they're easily distracted, subject to fatigue, intoxication
and more. a common excuse for a crash is 'i didn't see him'.

autonomous vehicles will never have that excuse. they always see, and
not just ahead.

radar, lidar and 360 degree cameras are always on, scanning everything,
regardless of daylight/night, rain, fog, etc. wheel sensors monitor
traction and can *immediately* adjust if any wheel starts to slip. v2v
communicates with other vehicles to signal its intent and learn of the
intents of other vehicles and any road hazards. onboard computers can
analyze all possible outcomes in microseconds, making the safest choice
possible.

humans can't come anywhere close to that.

An autonomous car will drive at a
steady speed for as long as it doesn't notice any problems. No human
does that.


which is one reason why humans are bad drivers.

there's no reason not to be at a steady speed if there are no problems.
in fact, that's the safest thing to do.

constant varying of speed causes traffic bunching, which raises the
risk for collisions.

also, the person who was hit stepped in front of a moving vehicle,
assuming it would stop, and in an area where visibility was limited.
that's not a good strategy.

Can't blame the victim. The car could have and should have avoided her.


of course the victim can be blamed. stepping in front of a moving
vehicle is a bad idea, regardless what type of driver it is.


Funny, you argued exactly the opposite elsethread regarding drivers
"slamming on brakes".


two completely separate and unrelated things.

an autonomous vehicle could know about the pedestrian long before it
was even a factor, therefore no reason to slam on the brakes.

autonomous vehicles do not need to be perfect. nothing can be.

they only need to be better than humans, which sadly, is not that hard
to do.

Agree that they don't need to be perfect, but the challenge is harder than
originally thought. Full autonomy is a long way off.


it's already happening in some areas.

what will be a long way off is when human driven vehicles are the rare
exception, but that doesn't actually matter.


And that's where the real challenge is: a mixed economy of autonomous
and human drivers. It is forcing two disparate systems to coexist.
Autonomous systems would work much better without chaotic humans around.


mixed is not an issue whatsoever.

more than 1 million people die every year due to motor vehicle crashes,
with ~50 million more injured. that's 2 people killed every *minute*.

https://www.asirt.org/safe-travel/road-safety-facts/

nearly all of those could be avoided.

Most can be saved with better road safety: tighten car/driver regulation
and improve road layouts. That's what's happened in the UK, and we have
amongst the lowest road deaths in the world - despite being a densely
populated country.


road safety doesn't fix human error, which is what causes crashes.


Sure it does. It is the major reason why accident rates have fallen over
the years. Better signage, clear road markings, lighting at dangerous
junctions, speed bumps etc all contribute by giving the human less
chance to make errors or at least recover if they do.


nope. the main reason is safer vehicles, including having crumple
zones, airbags, anti-lock brakes, adaptive cruise control and other
technology.

speed bumps don't do anything other than annoy people, with some of
them being less of a problem at *faster* speeds.

people do stupid **** all the time.


Yep and they're not dying because of it as much as they used to.


because of safer vehicles.
#178 - February 15th 19, 12:21 AM - posted to alt.comp.os.windows-10
Paul[_32_]

Chris wrote:


Wrong. The AI should have recognised the bike and/or the person - it is
designed to do so. It did not, and misclassified them as a false
positive. This is the real risk of fully autonomous vehicles: they
don't fail safe.


https://www.wired.com/story/uber-sel...a-ntsb-report/

"The car radar and lidar sensors detected Herzberg about
six seconds before the crash - first identifying her as
an unknown object, then as a vehicle, and then as a bicycle,
each time adjusting its expectations for her path of travel.

About a second before impact, the report says "the self-driving
system determined that an emergency braking maneuver was needed
to mitigate a collision." Uber, however, does not allow its
system to make emergency braking maneuvers on its own. === Strike 1

Furthermore, Uber had turned off the Volvo built-in automatic
emergency braking system to avoid clashes with its own tech. === Strike 2

[Yadda Yadda safety driver...] === Strike 3
"

The police initially released a deceptively-processed dashcam
video, darkened to make it look like the victim "challenged"
the car by crossing at a "dark point". The problem with this
deception is that LIDAR works without street lighting, and one
weakness of LIDAR is that perfectly stationary objects
"disappear" into the ambient background (buildings, telephone
poles, fire hydrants). Since Herzberg was slowly advancing
across the street, the LIDAR would have seen her relative
motion.

The classifiers used are pretty good. I looked at the results
of a "student" LIDAR project, where they only had one laser
and the dot map was pretty sparse. And yet, in some test
footage, it classified pedestrians and cyclists standing
on the side of the road, ready to enter the traffic stream.
So even with absolutely crappy LIDAR, it's still possible
to pick up stuff. Having read a few of the articles of
this sort, I always thought it comical that bicyclists
received that much attention. I don't remember, for example,
a classifier highlighting a motor cycle as a motor cycle.
But lots of test footage has bicycles, as if they're the
cream of the crop for accidents.

But there will be cases in the future where the S/N (signal-to-noise
ratio) will be so bad that no subsystem in the car will have a good
view of the road. I've not seen any mention yet of an autonomous car
that "refuses to drive because conditions are too bad". Maybe this is
because the human project leaders are carefully selecting "fully
mapped" dry temperate roads for the cars to drive on.

What I feel is wrong with the approach the engineers are using is
"early optimization" and selecting the sensor suite before
the driving code is finished.
challenging terrain yet, to be selecting just one system
for looking ahead. It's almost like they don't want to
do sensor fusion. Cost reduction is what you do *after*
you perfect the Mark 1 version. (It would be like making the
walls thinner on the first Fusion Reactor so it could be
built cheaply.) If they had more sensor subsystems than
absolutely necessary, they could write additional code
to show how a warm standby sensor would have affected
decisions in tricky situations.

I also don't buy the idea that "two emergency stop systems would be
bad". You give one priority; then, if the "single CPU" on the
autonomous section of the car has crashed, the braking system the car
already has functions as your "line of last defense".
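A minimal sketch of that arbitration, with invented names: the autonomous stack's braking requests are honoured while its watchdog says it is alive, and the vehicle's stock braking system stays enabled as the line of last defence, so whichever source asks for harder braking wins.

from typing import Optional

def braking_command(stack_decel: Optional[float],
                    stock_aeb_decel: Optional[float],
                    stack_alive: bool) -> float:
    """Arbitrate between two emergency-braking sources (m/s^2; 0.0 = no braking).
    The autonomous stack's request is honoured while its watchdog reports it
    alive; the built-in AEB is never disabled, so the harder request wins."""
    stack = stack_decel if (stack_alive and stack_decel is not None) else 0.0
    stock = stock_aeb_decel if stock_aeb_decel is not None else 0.0
    return max(stack, stock)

# The stack hesitates (calls the obstacle a false positive, requests no braking),
# but the stock AEB fires: the car still brakes hard.
print(braking_command(stack_decel=0.0, stock_aeb_decel=6.5, stack_alive=True))  # 6.5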

If your code is so bad the car is "making too many emergency stops",
what does that tell you? You don't disable the braking code, FFS.
You fix the code to work properly. Or you install enough sensors to
increase the signal available, and use sensor fusion to make better
decisions. Or alternatively, you abandon autonomous cars as a bad
idea (you cannot abandon the notion of "engineering" by giving
control to "software wienies" - the requirements come perilously
close to "sentience", not "mechanization").

Paul
#179 - February 15th 19, 01:49 AM - posted to alt.comp.os.windows-10,alt.windows7.general
J. P. Gilliver (John)[_4_]

(When you change the followup-to 'groups in a thread [e. g. by removing
some], it's usually considered polite to say you are doing so.)

In message , mechanic
writes:
On Wed, 13 Feb 2019 09:21:53 +0000, Chris wrote:

In the UK, if you hit a pedestrian with your car you will be
charged with either "Dangerous driving" or "Driving without due
care and attention" and you have to make the case that it was
unavoidable.


Yes we have very much a blame culture in the UK, there's no such
thing as an 'accident'.


Has there ever been a case where - *especially where there is a
pedestrian light against which they crossed* - a _pedestrian_ has been
penalised, especially if they caused a vehicle to swerve to avoid them
and hit another vehicle, street furniture, etcetera? (_Harder_ to prove
as - AFAIK! - "leaving the scene of an accident" may not be an offence,
and is certainly easier, for pedestrians.)

In UK, AFAIK, it isn't actually an _offence_ - though it may be stupid -
for a _pedestrian_ to cross against a red light, though it is for a
vehicle. I think it is in Germany (or was about 40 years ago, and I
doubt it's changed); and I have heard mention of "jaywalking" in the
USA, so I think it is there too?

I don't see any mention of vehicle emergency braking systems on here
but they are available in many new cars.


Do you mean _automatic_ such systems (i. e.
obstruction-sensor-triggered)?
--
J. P. Gilliver. UMRA: 1960/1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

The thing about smut is it harms no one and it's rarely cruel. Besides, it's a
gleeful rejection of the dreary and the "correct".
- Alison Graham, RT 2014/10/25-31
#180 - February 15th 19, 02:50 AM - posted to alt.comp.os.windows-10,alt.windows7.general
123456789[_3_]

On 2/14/2019 6:49 PM, J. P. Gilliver (John) wrote:

Has there ever been a case where - *especially where there is a
pedestrian light against which they crossed* - a _pedestrian_ has
been penalised


In downtown Phoenix (AZ/US) cops periodically run pedestrian stings and
write hundreds of tickets to the jaywalkers, red-light walkers, etc., who
tie up traffic and cause near-accidents. After the word gets around,
things get better for a while, until it needs to be done again.

leaving the scene of an accident" may not be an offence, and is
certainly easier, for pedestrians.


True. But if they can later be located, they can be cited for causing
the accident. So walk fast...

 



