A Windows XP help forum. PCbanter

Microsoft 'Confirms' Windows 7 New Monthly Charge



 
 
  #181  
Old February 15th 19, 05:31 AM posted to alt.comp.os.windows-10,alt.windows7.general
nospam
external usenet poster
 
Posts: 4,718
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

In article , J. P. Gilliver (John)
wrote:

In the UK, if you hit a pedestrian with your car you will be
charged with either "Dangerous driving" or "Driving without due
care and attention" and you have to make the case that it was
unavoidable.


Yes we have very much a blame culture in the UK, there's no such
thing as an 'accident'.


Has there ever been a case where - *especially where there is a
pedestrian light against which they crossed* - a _pedestrian_ has been
penalised, especially if they caused a vehicle to swerve to avoid them
and hit another vehicle, street furniture, etcetera? (_Harder_ to prove
as - AFAIK! - "leaving the scene of an accident" may not be an offence,
and is certainly easier, for pedestrians.)


if that does happen, it's very rare. cops like to go after the drivers
because the fines are a lot higher. it's all about money and quotas.

In UK, AFAIK, it isn't actually an _offence_ - though it may be stupid -
for a _pedestrian_ to cross against a red light, though it is for a
vehicle. I think it is in Germany (or was about 40 years ago, and I
doubt it's changed); and I have heard mention of "jaywalking" in the
USA, so I think it is there too?


jaywalking does exist and is generally stupid, however, it's not always
illegal, and in new york city, jaywalking is expected.
  #182  
Old February 15th 19, 05:31 AM posted to alt.comp.os.windows-10
nospam
external usenet poster
 
Posts: 4,718
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

In article , Paul
wrote:

Wrong. The AI should have recognised the bike and/or the person - it is
designed to do so. It did not and misclassified them as a
false-positive. This is the real risk of fully autonomous vehicles:
they don't fail safe.


https://www.wired.com/story/uber-sel...a-ntsb-report/

"The car's radar and lidar sensors detected Herzberg about
six seconds before the crash - first identifying her as
an unknown object, then as a vehicle, and then as a bicycle,
each time adjusting its expectations for her path of travel.


6 seconds is a long time, during which evasive action could have been
taken to avoid hitting the pedestrian.

the operator was *watching* *tv*, not paying attention to what was
going on. had she been paying attention, she could have avoided the
collision. even honking the horn would have helped.

About a second before impact, the report says "the self-driving
system determined that an emergency braking maneuver was needed
to mitigate a collision." Uber, however, does not allow its
system to make emergency braking maneuvers on its own. ===
Strike 1


that's not a strike.

again, the operator was *watching* *tv* while all this was going on.

had she been watching the road and the indicators in the vehicle, she
would have been aware of a pedestrian ~6 seconds earlier.

consider what would have happened if a human driver was looking down at
a phone and not at the road.

Furthermore, Uber had turned off the Volvo built-in automatic
emergency braking system to avoid clashes with its own tech. ===
Strike 2


yep, because uber was testing their own system.

having two conflicting systems active at the same time won't work very
well.

[Yadda Yadda safety driver...] === Strike 3


which was the main contributing factor.

The police initially released a deceptively-processed dashcam
video, darkened to make it look like the victim "challenged"
the car by way of crossing at a "dark point". The problem with
this deception, is LIDAR works without street lighting, and
one weakness of LIDAR is perfectly stationary objects
"disappear" into the ambient background (buildings, telephone
poles, fire hydrants).


nope. nothing disappears and those aren't stationary relative to the
vehicle either.

Since Herzberg was slowly advancing
across the street, this would cause the LIDAR to see the
relative motion.


it saw the pedestrian.

the problem is that it assumed it was a false positive for several
reasons, one of which is that there was no reason for a pedestrian to
be at that point in the roadway.
  #183  
Old February 15th 19, 10:51 PM posted to alt.comp.os.windows-10
Chris
external usenet poster
 
Posts: 832
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

nospam wrote:
In article , Chris
wrote:

Nope. The system failed. It misclassified the woman as a false positive (an
unimportant obstruction like a newspaper). Automated braking was disabled
as a requirement by the authorities to avoid the cars emergency-braking
every time they saw some rubbish on the road.


https://www.extremetech.com/extreme/...h-ubercar-saw-woman-called-it-a-false-positive

nope. what failed was the human operator, who was watching tv and not
monitoring the vehicle.


Wrong. The AI should have recognised the bike and/or the person - it is
designed to do so. It did not and misclassified them as a
false-positive.


in other words, it *did* recognize it.


You are being intentionally obtuse.

As i said to wolf, relying on a human to take control in 1.3 seconds is
never going to work.

exactly why autonomous vehicles are so important for reducing crashes,
injuries and fatalities, with reaction time measured in milliseconds,
able to monitor much more than any human ever could and unaffected by
fatigue, drugs or inattention.


*If* and only if they're capable of dealing with edge cases that
unassisted human drivers can.


human drivers are *horrible* at edge cases.


The AIs aren't great either...

not only that, but once humans get a license, it's forever.


No it isn't. If they seriously screw up, they get banned, made to resit
their test or even go to jail.


A human driver can pre-empt risky
situations by slowing down or preparing themselves to brake. They pick
up on lots of subtle visual cues.


no they definitely don't.

autonomous vehicles can 'see' far more data than humans ever possibly
could.


And yet function only half as well.

humans have tunnel vision. they rarely know what's in their blind spots
or behind, they're easily distracted, subject to fatigue, intoxication
and more. a common excuse for a crash is 'i didn't see him'.

autonomous vehicles will never have that excuse. they always see, and
not just ahead.


Blind faith. I'd like to see them negotiate the roads around here.

radar, lidar and 360 degree cameras are always on, scanning everything,
regardless of daylight/night, rain, fog, etc. wheel sensors monitor


Rain? Not yet. It really screws their perception.

traction and can *immediately* adjust if any wheel starts to slip. v2v
communicates with other vehicles to signal its intent and learn of the
intents of other vehicles and any road hazards. onboard computers can
analyze all possible outcomes in microseconds, making the safest choice
possible.


All theoretical. Real world is very different.

humans can't come anywhere close to that.


And yet can still manage to drive several hundred million miles for every
single death. Can always be better, but still pretty good for a lump of
chemical reactions.

An autonomous car will drive at a
steady speed for as long as it doesn't notice any problems. No human
does that.


which is one reason why humans are bad drivers.


Nope.

there's no reason not to be at a steady speed if there are no problems.
in fact, that's the safest thing to do.


Nope.

constant varying of speed causes traffic bunching, which raises the
risk for collisions.


Nope. Collisions are caused by the impatient driving too fast or too close.


also, the person who was hit stepped in front of a moving vehicle,
assuming it would stop, and in an area where visibility was limited.
that's not a good strategy.

Can't blame the victim. The car could have and should have avoided her.

of course the victim can be blamed. stepping in front of a moving
vehicle is a bad idea, regardless what type of driver it is.


Funny, you argued exactly the opposite elsethread regarding drivers
"slamming on brakes".


two completely separate and unrelated things.

an autonomous vehicle could know about the pedestrian long before it
was even a factor, therefore no reason to slam on the brakes.


The only course of action the car could have done to avoid the crash was
an emergency brake, but it was disabled. So, yes it would have slammed on
the brakes.


road safety doesn't fix human error, which is what causes crashes.


Sure it does. It is the major reason why accident rates have fallen over
the years. Better signage, clear road markings, lighting at dangerous
junctions, speed bumps etc all contribute by giving the human less
chance to make errors or at least recover if they do.


nope. the main reason is safer vehicles, including having crumple
zones, airbags, anti-lock brakes, adaptive cruise control and other
technology.


I said accidents not deaths. Crumple zones and airbags don't avoid
accidents. Improvements in both cars and roads have helped reduce accidents
and deaths.


  #184  
Old February 15th 19, 10:51 PM posted to alt.comp.os.windows-10
Chris
external usenet poster
 
Posts: 832
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

Wolf K wrote:
On 2019-02-14 19:21, Paul wrote:
Chris wrote:


Wrong. The AI should have recognised the bike and/or the person - it
is designed to do so. It did not and misclassified them as a
false-positive. This is the real risk of fully autonomous vehicles:
they don't fail safe.


https://www.wired.com/story/uber-sel...a-ntsb-report/

[...]

Thanks for this. Clears things up a lot.


+1

  #185  
Old February 16th 19, 01:33 AM posted to alt.comp.os.windows-10
nospam
external usenet poster
 
Posts: 4,718
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

In article , Chris
wrote:


not only that, but once humans get a license, it's forever.


No it isn't. If they seriously screw up, they get banned, made to resit
their test or even go to jail.


yes, if they seriously screw up. otherwise, it's forever.

the point is that there is no periodic testing to see if a driver is
still competent.

most people don't seriously screw up, so they keep on driving even
though they are no longer able to safely do so.

if someone passed at age 16, all they need to do is pass an eye test
(with or without corrective lenses) and pay the fee.

the reaction time of someone in their 80s is not what it was when they
were a teen or 20something.

A human driver can pre-empt risky
situations by slowing down or preparing themselves to brake. They pick
up on lots of subtle visual cues.


no they definitely don't.

autonomous vehicles can 'see' far more data than humans ever possibly
could.


And yet function only half as well.


actually much better. significantly so.

almost every crash involving an autonomous vehicle was due to a human
driven vehicle, either the operator taking over and driving it manually
or another human driven vehicle colliding with it.



radar, lidar and 360 degree cameras are always on, scanning everything,
regardless of daylight/night, rain, fog, etc. wheel sensors monitor


Rain? Not yet. It really screws their perception.


false. radar works fine in the rain and lidar is getting to that point.

https://www.draper.com/news-releases...ather-detection-lidar
Recently, a team of Draper engineers tackled this problem and
successfully demonstrated a LiDAR system that can see through dense
fog. In one test, the team filled a hockey rink with fog so dense
human vision could barely see an object 30 meters away. Draper's
LiDAR system was able to see objects 54 meters away, almost twice
that distance, on the other side of the rink.

traction and can *immediately* adjust if any wheel starts to slip. v2v
communicates with other vehicles to signal its intent and learn of the
intents of other vehicles and any road hazards. onboard computers can
analyze all possible outcomes in microseconds, making the safest choice
possible.


All theoretical. Real world is very different.


it's not theoretical. it's happening *today*.

humans can't come anywhere close to that.


And yet can still manage to drive several hundred million miles for every
single death. Can always be better, but still pretty good for a lump of
chemical reactions.


more than 1.2 million deaths worldwide per year is not in any way good.

An autonomous car will drive at a
steady speed for as long as it doesn't notice any problems. No human
does that.


which is one reason why humans are bad drivers.


Nope.

there's no reason not to be at a steady speed if there are no problems.
in fact, that's the safest thing to do.


Nope.

constant varying of speed causes traffic bunching, which raises the
risk for collisions.


Nope.


wrong to all three.

Collisions are caused by the impatient driving too fast or too close.


nope. crashes are caused by human error, including driving too fast
*for* *conditions*, which is *not* the same as driving too fast. the
number on the signpost is artificially low to maximize revenue, not
safety.

autonomous vehicles will eliminate human error, which not only means
fewer crashes, but it also means vehicles can be closer together,
thereby increasing road capacity and greatly reducing traffic problems.

also, the person who was hit stepped in front of a moving vehicle,
assuming it would stop, and in an area where visibility was limited.
that's not a good strategy.

Can't blame the victim. The car could have and should have avoided her.

of course the victim can be blamed. stepping in front of a moving
vehicle is a bad idea, regardless what type of driver it is.

Funny, you argued exactly the opposite elsethread regarding drivers
"slamming on brakes".


two completely separate and unrelated things.

an autonomous vehicle could know about the pedestrian long before it
was even a factor, therefore no reason to slam on the brakes.


The only course of action the car could have done to avoid the crash was
an emergency brake, but it was disabled. So, yes it would have slammed on
the brakes.


absolutely wrong.

again, the death was the fault of the operator, who was watching tv
instead of paying attention to the road *and* the pedestrian, who
stepped in front of a moving vehicle outside of a marked crosswalk at
night, and who also tested positive for methamphetamine and marijuana.

the autonomous system detected the pedestrian more than 5 seconds
before impact, which was more than enough time to have taken action.

https://www.abc15.com/news/region-so...tempe-police-release-new-video-from-deadly-self-driving-uber-crash
Officers calculated that had Vasquez been paying attention, she could
have reacted 143 feet before impact and brought the SUV to a stop
about 42.6 feet before hitting Herzberg.

"This crash would not have occurred if Vasquez would have been
monitoring the vehicle and roadway conditions and was not
distracted," the report stated.
[...]
The report also says Herzberg unlawfully crossing the road at an
unmarked location was a factor in the crash.

and because of that, the operator of the uber vehicle is being charged
with manslaughter:

https://www.azcentral.com/story/news...22/self-driving-uber-fatal-crash-prosecution-may-precedent-setting/726652002/
Investigators later determined the crash would not have occurred if
Vasquez had been "monitoring the vehicle and roadway conditions and
was not distracted."

...A police report indicated investigators are seeking a manslaughter
charge against Vasquez.
[...]
"She was clearly derelict in her job and that she knew that was not
permitted and, as a result of that, an innocent person was killed,"
Marchant said.



road safety doesn't fix human error, which is what causes crashes.

Sure it does. It is the major reason why accident rates have fallen over
the years. Better signage, clear road markings, lighting at dangerous
junctions, speed bumps etc all contribute by giving the human less
chance to make errors or at least recover if they do.


nope. the main reason is safer vehicles, including having crumple
zones, airbags, anti-lock brakes, adaptive cruise control and other
technology.


I said accidents not deaths.


crashes, not accidents. the cause is almost always human error, which
means the crashes could have been prevented.

Crumple zones and airbags don't avoid
accidents. Improvements in both cars and roads have helped reduce accidents
and deaths.


cars are indeed safer and crash barriers on roadways help, but that's
really about it.

humans still **** up, a lot.
  #186  
Old February 16th 19, 02:03 AM posted to alt.comp.os.windows-10
123456789[_3_]
external usenet poster
 
Posts: 239
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

On 2/15/2019 6:33 PM, nospam wrote:

the reaction time of someone in their 80s is not what it was when they
were a teen


The teen needs that extra fast reaction time to survive his extra fast
driving. (At least I did)...
  #187  
Old February 16th 19, 02:11 AM posted to alt.comp.os.windows-10
nospam
external usenet poster
 
Posts: 4,718
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

In article , Wolf K
wrote:

not only that, but once humans get a license, it's forever.
No it isn't. If they seriously screw up, they get banned, made to resit
their test or even go to jail.

yes, if they seriously screw up. otherwise, it's forever.

[...]

In Ontario, you must pass an eye test, cognitive test, and sometimes
also a driving test, every two years from age 80 onward. I think it
should start at a younger age, but not enough over-65 voters agree with me.


of course not. they don't want to risk losing their license.
  #188  
Old February 17th 19, 03:09 PM posted to alt.comp.os.windows-10
Chris
external usenet poster
 
Posts: 832
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

Wolf K wrote:
On 2019-02-15 20:33, nospam wrote:
In ,
wrote:


not only that, but once humans get a license, it's forever.
No it isn't. If they seriously screw up, they get banned, made to resit
their test or even go to jail.

yes, if they seriously screw up. otherwise, it's forever.

[...]

In Ontario, you must pass an eye test, cognitive test, and sometimes
also a driving test, every two years from age 80 onward. I think it
should start at a younger age, but not enough over-65 voters agree with me.


Actually, older drivers are generally safe. It's the young (men) that need
to be carefully checked in terms of the biggest risk to other road users.

You may have heard the Queen's husband recently had a potentially serious
accident which raised the question of the driver's age: he's 96. There are
regular stories of elderly drivers doing daft things, but that's the
exception.

  #189  
Old February 17th 19, 03:34 PM posted to alt.comp.os.windows-10
nospam
external usenet poster
 
Posts: 4,718
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

In article , Chris
wrote:

not only that, but once humans get a license, it's forever.
No it isn't. If they seriously screw up, they get banned, made to resit
their test or even go to jail.
yes, if they seriously screw up. otherwise, it's forever.

[...]

In Ontario, you must pass an eye test, cognitive test, and sometimes
also a driving test, every two years from age 80 onward. I think it
should start at a younger age, but not enough over-65 voters agree with me.


Actually, older drivers are generally safe. It's the young (men) that need
to be carefully checked in terms of the biggest risk to other road users.


nope. new drivers and elderly drivers are the highest risk due to lack
of skill and loss of ability, respectively.

You may have heard the Queen's husband recently had a potentially serious
accident which raised the question of the driver's age: he's 96. There are
regular stories of elderly drivers doing daft things, but that's the
exception.


he no longer has his license to drive. revocation would look bad for
the royals, so he 'voluntarily' surrendered it.
  #190  
Old February 17th 19, 03:35 PM posted to alt.comp.os.windows-10
Chris
external usenet poster
 
Posts: 832
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

nospam wrote:
In article , Chris
wrote:


not only that, but once humans get a license, it's forever.


No it isn't. If they seriously screw up, they get banned, made to resit
their test or even go to jail.


yes, if they seriously screw up. otherwise, it's forever.

the point is that there is no periodic testing to see if a driver is
still competent.


Every day is a test. If you don't crash, you've passed! Therefore you're
clearly competent. 99.999% are perfectly fine and don't need periodic
testing. It's just a waste of time, money and effort.

most people don't seriously screw up, so they keep on driving even
though they are no longer able to safely do so.


If they don't screw up they are, by definition, safe.

if someone passed at age 16, all they need to do is pass an eye test
(with or without corrective lenses) and pay the fee.

the reaction time of someone in their 80s is not what it was when they
were a teen or 20something.


Older drivers are more careful and drive slower. It evens out.

A human driver can pre-empt risky
situations by slowing down or preparing themselves to brake. They pick
up on lots of subtle visual cues.

no they definitely don't.

autonomous vehicles can 'see' far more data than humans ever possibly
could.


And yet function only half as well.


actually much better. significantly so.


I won't bother asking for proof. You never have any...

almost every crash involving an autonomous vehicle was due to a human
driven vehicle, either the operator taking over and driving it manually
or another human driven vehicle colliding with it.


Off the top of my head, I know of three which were directly car errors.
Obviously ignoring the uber car.

1) The very first Tesla car death, the car didn't see the left-turning
truck.
2) there was another where the car drove into the back of a fire engine
3) another Tesla fatality was when the driver was asleep and it accelerated
into a barrier.


Collisions are caused by the impatient driving too fast or too close.


nope. crashes are caused by human error, including driving too fast
*for* *conditions*, which is *not* the same as driving too fast.


Of course it is.


autonomous vehicles will eliminate human error,


Impossible. They are designed by humans. The errors will just be elsewhere.


which not only means
fewer crashes, but it also means vehicles can be closer together,
thereby increasing road capacity and greatly reducing traffic problems.


You've just reinvented the train.

also, the person who was hit stepped in front of a moving vehicle,
assuming it would stop, and in an area where visibility was limited.
that's not a good strategy.

Can't blame the victim. The car could have and should have avoided her.

of course the victim can be blamed. stepping in front of a moving
vehicle is a bad idea, regardless what type of driver it is.

Funny, you argued exactly the opposite elsethread regarding drivers
"slamming on brakes".

two completely separate and unrelated things.

an autonomous vehicle could know about the pedestrian long before it
was even a factor, therefore no reason to slam on the brakes.


The only course of action the car could have done to avoid the crash was
an emergency brake, but it was disabled. So, yes it would have slammed on
the brakes.


absolutely wrong.

again, the death was the fault of the operator, who was watching tv
instead of paying attention to the road *and* the pedestrian, who
stepped in front of a moving vehicle outside of a marked crosswalk at
night, and who also tested positive for methamphetamine and marijuana.


Victim blaming and you're making **** up. You can assert all you like.
Doesn't make it right.

Crumple zones and airbags don't avoid
accidents. Improvements in both cars and roads have helped reduce accidents
and deaths.


cars are indeed safer and crash barriers on roadways help, but that's
really about it.


There are many more ways, but you're not prepared to accept anything
outside of your opinion.

humans still **** up, a lot.


That's never going to stop. Even in a fully autonomous world. The tech has
to be able to compensate for it. It just can't at the moment.


  #191  
Old February 17th 19, 04:17 PM posted to alt.comp.os.windows-10
Stephen Wolstenholme[_6_]
external usenet poster
 
Posts: 275
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

On Sun, 17 Feb 2019 15:09:24 -0000 (UTC), Chris
wrote:

Wolf K wrote:
On 2019-02-15 20:33, nospam wrote:
In ,
wrote:


not only that, but once humans get a license, it's forever.
No it isn't. If they seriously screw up, they get banned, made to resit
their test or even go to jail.
yes, if they seriously screw up. otherwise, it's forever.

[...]

In Ontario, you must pass an eye test, cognitive test, and sometimes
also a driving test, every two years from age 80 onward. I think it
should start at a younger age, but not enough over-65 voters agree with me.


Actually, older drivers are generally safe. It's the young (men) that need
to be carefully checked in terms of the biggest risk to other road users.


That's true. When I was young I always drove much too fast. I was
usually trying to impress any girl I was with. One girl decided I was
a maniac and left me to go home on the bus.

As I got older I started driving slower. I eventually gave up driving
and now pay my carer to supply the transport, or get a taxi.

Now I'm 70 I feel unsafe no matter who is driving!

Steve

--
http://www.npsnn.com

  #192  
Old February 17th 19, 05:04 PM posted to alt.comp.os.windows-10
Ken Blake[_5_]
external usenet poster
 
Posts: 2,221
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

On Sun, 17 Feb 2019 16:17:51 +0000, Stephen Wolstenholme
wrote:

On Sun, 17 Feb 2019 15:09:24 -0000 (UTC), Chris
wrote:


Actually, older drivers are generally safe. It's the young (men) that need
to be carefully checked in terms of the biggest risk to other road users.


That's true. When I was young I always drove much too fast. I was
usually trying to impress any girl I was with. One girl decided I was
a maniac and left me to go home on the bus.




I disagree. I think young drivers are often unsafe, and older drivers
are often unsafe because of poor vision, poor hearing, and poor
reflexes.

The safest drivers are the middle-aged.
  #193  
Old February 17th 19, 08:22 PM posted to alt.comp.os.windows-10
nospam
external usenet poster
 
Posts: 4,718
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

In article , Ken Blake
wrote:

Actually, older drivers are generally safe. It's the young (men) that need
to be carefully checked in terms of the biggest risk to other road users.


That's true. When I was young I always drove much too fast. I was
usually trying to impress any girl I was with. One girl decided I was
a maniac and left me to go home on the bus.



I disagree. I think young drivers are often unsafe, and older drivers
are often unsafe because of poor vision, poor hearing, and poor
reflexes.

The safest drivers are the middle-aged.


correct.
  #194  
Old February 17th 19, 08:22 PM posted to alt.comp.os.windows-10
nospam
external usenet poster
 
Posts: 4,718
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

In article , Wolf K
wrote:

not only that, but once humans get a license, it's forever.
No it isn't. If they seriously screw up, they get banned, made to resit
their test or even go to jail.
yes, if they seriously screw up. otherwise, it's forever.
[...]

In Ontario, you must pass an eye test, cognitive test, and sometimes
also a driving test, every two years from age 80 onward. I think it
should start at a younger age, but not enough over-65 voters agree with
me.
Actually, older drivers are generally safe. It's the young (men) that need
to be carefully checked in terms of the biggest risk to other road users.

nope. new drivers and elderly drivers are the highest risk due to lack
of skill and loss of ability, respectively.

[...]

Correct, although the fine-grained details are a bit more complex. Stats
show that accident rates increase after licensing, reaching a max around
5 years after obtaining the licence.


nope.

newly licensed drivers are the most dangerous, becoming safer as they
gain experience, bottoming out in the 30s-50s, and then rising again
around 60s.

another factor is that teens think they are immortal and take risks
that normal people would never do.

Since most people (in N. America)
get their licenses in the late teens to early 20s, "younger drivers"
mid- to late- 20s are a higher risk group. Stats also show that rates
decrease until about 40-45, then begin to rise rather slowly, then
increase faster from about age 60 onwards. By late 60s to early 70s,
accident rates are about the same for younger and older drivers.


crash rates are similar, but fatalities are higher:
https://one.nhtsa.gov/people/injury/olddrive/Figure8.gif

part of that is due to being old. what would be a minor injury for a
young person could be fatal for an elderly person.

One of the main reasons for increasing accident rates among older
drivers is reduced peripheral vision, which translates into a smaller
visual field, hence reduced awareness of possible hazards. Hence "I
didn't see X" is often the truth. IMO older drivers need re-training to
improve old habits and develop new ones.


no. the main reason is their reaction time is not as good and they
drive slower than prevailing traffic, putting themselves and others at
risk.

peripheral vision is a very minor factor and easily resolved by turning
one's head.
  #195  
Old February 17th 19, 08:22 PM posted to alt.comp.os.windows-10
nospam
external usenet poster
 
Posts: 4,718
Default Microsoft 'Confirms' Windows 7 New Monthly Charge

In article , Chris
wrote:


not only that, but once humans get a license, it's forever.

No it isn't. If they seriously screw up, they get banned, made to resit
their test or even go to jail.


yes, if they seriously screw up. otherwise, it's forever.

the point is that there is no periodic testing to see if a driver is
still competent.


Every day is a test. If you don't crash, you've passed! Therefore you're
clearly competent. 99.999% are perfectly fine and don't need periodic
testing. It's just a waste of time, money and effort.


nonsense.

most people don't seriously screw up, so they keep on driving even
though they are no longer able to safely do so.


If they don't screw up they are, by definition, safe.


nonsense.

in your world, if someone runs a red light and doesn't hit anything, it
was safe to do so.

if someone passed at age 16, all they need to do is pass an eye test
(with or without corrective lenses) and pay the fee.

the reaction time of someone in their 80s is not what it was when they
were a teen or 20something.


Older drivers are more careful and drive slower. It evens out.


nope. older drivers are an increasing risk and driving too slow is very
dangerous.

https://one.nhtsa.gov/people/injury/olddrive/Figure7.gif
https://one.nhtsa.gov/people/injury/olddrive/Figure8.gif

https://ars.els-cdn.com/content/image/1-s2.0-S0001457516303918-gr1.jpg

also, slower drivers are a much greater risk than faster drivers:
https://upload.wikimedia.org/wikipedia/commons/2/2f/Solomon_Curve.png

not shown in that chart, slower drivers are more likely to be in a
multi-vehicle crash, injuring or killing innocent others, while faster
drivers tend to be in a single vehicle crash, only affecting themselves
and any passengers, who were aware of the risks.

A human driver can pre-empt risky
situations by slowing down or preparing themselves to brake. They pick
up on lots of subtle visual cues.

no they definitely don't.

autonomous vehicles can 'see' far more data than humans ever could.

And yet function only half as well.


actually much better. significantly so.


I won't bother asking for proof. You never have any...


already provided, and i *always* have proof.

here's more, although you'll ignore it like you always do:
https://rmi.org/wp-content/uploads/2017/04/AV-graph-1.png

almost every crash involving an autonomous vehicle was due to a human
driven vehicle, either the operator taking over and driving it manually
or another human driven vehicle colliding with it.


Off the top of my head, I know three which were directly car errors.
Obviously ignoring the uber car.

1) The very first Tesla car death, the car didn't see the left-turning
truck.


the nhtsa disagrees with you. he was driving too fast for conditions,
into a bright sky with limited visibility, and didn't notice the truck
either.

in other words, there probably would have been a crash without autonomy.

https://www.nytimes.com/2017/01/19/b...-autopilot-fat
al-crash.html
Tesla has said its camera failed to recognize the white truck against
a bright sky. But the agency essentially found that Mr. Brown was not
paying attention to the road. It determined he set his car's cruise
control at 74 miles per hour about two minutes before the crash, and
should have had at least seven seconds to notice the truck before
crashing into it.
Neither Autopilot nor Mr. Brown hit the brakes. The agency said that
although Autopilot did not prevent the accident, the system performed
as it was designed and intended, and therefore did not have a defect.

https://www.tesla.com/blog/tragic-loss
...Neither Autopilot nor the driver noticed the white side of the
tractor trailer against a brightly lit sky, so the brake was not
applied.

that particular issue has been addressed:
https://www.washingtonpost.com/local...no-defect-foun
d-in-tesla-autopilot-system-used-in-deadly-florida-crash/2017/01/19/36e4
fa7c-de65-11e6-ad42-f3375f271c9c_story.html?noredirect=on
Months after the crash, Tesla sent out software upgrades that Musk
said "very likely" would have prevented the Florida crash by making
better use of onboard radar technology. Radar can be a powerful
collision-avoidance tool but can be fooled.
....
Tesla also tightened its cars' approach to drivers who seem to not be
paying attention. Drivers who ignore an alarm more than three times
in an hour "will have to park the car and restart it in order to
enable Autosteer," Musk said.

2) there was another where the car drove into the back of a fire engine


this one?
https://www.wired.com/story/tesla-autopilot-why-crash-radar/
On Saturday August 25, a Tesla Model S crashed into a stopped
firetruck in San Jose, California. The two occupants inside the Tesla
sustained minor injuries, and the 37-year-old driver was arrested on
suspicion of driving under the influence of alcohol.

you can't blame the vehicle when the driver was drunk.

or this one, where the driver was not paying attention, ignored the
vehicle's warnings to put her hands on the steering wheel and used it
on a road where it was not designed to be used?
https://electrek.co/2018/05/16/tesla...autopilo-logs-
nhtsa-investigates/
• About 1 minute and 22 seconds before the crash, she re-enabled
Autosteer and Cruise Control, and then, within two seconds, took her
hands off the steering wheel again. She did not touch the steering
wheel for the next 80 seconds until the crash happened; this is
consistent with her admission that she was looking at her phone at
the time.
• Contrary to the proper use of Autopilot, the driver did not pay
attention to the road at all times, did not keep her hands on the
steering wheel, and she used it on a street with no center median and
with stoplight controlled intersections.
....
The South Jordan Police Department confirmed today that the driver
received a traffic citation for "failure to keep proper lookout."

the police disagree with you about who was at fault.

3) another Tesla fatality was when the driver was asleep and it accelerated
into a barrier.


as above, the vehicle warned the driver to put his hands on the
steering wheel, which were ignored. also, the crash barrier had been
previously hit (by a human driver) and not replaced.

https://www.ctvnews.ca/autos/tesla-s...-california-cr
ash-was-on-autopilot-1.3866430
The electric car maker said the driver, who was killed in the
accident, did not have his hands on the steering wheel for six
seconds before the crash, despite several warnings from the vehicle.
Tesla Inc. tells drivers that its Autopilot system, which can keep
speed, change lanes and self-park, requires drivers to keep their
eyes on the road and hands on the wheel in order to take control of
the vehicle to avoid accidents.

Tesla said its vehicle logs show the driver took no action to stop
the Model X SUV from crashing into a concrete lane divider.
....
The company said the crash was made worse by a missing or damaged
safety shield on the end of the freeway barrier that is supposed to
reduce the impact into the concrete lane divider.

nothing is perfect, and autonomous vehicles will still crash, however,
that will happen a *lot* less frequently than with human drivers.

human drivers crash far too often, killing more than 1.2 million people
every year. that needs to be reduced, and the only way to do that is
autonomous vehicles.

Collisions are caused by impatient drivers going too fast or too close.


nope. crashes are caused by human error, including driving too fast
*for* *conditions*, which is *not* the same as driving too fast.


Of course it is.


it very definitely is not the same.

speeding is driving above the number on the signpost, normally
determined by politics, not safety. it's usually under-posted to
maximize ticket revenue.

driving too fast for conditions is driving at an unsafe speed given
current conditions, regardless of the number on the signpost.

if a road is posted at 50mph and it's foggy with low visibility, or
it's snowing and the roads are slippery, it would be unsafe to drive at
50 mph. depending how bad it is, 20mph might be too fast.

autonomous vehicles will eliminate human error,


Impossible. They are designed by humans. The errors will just be elsewhere.


designed by *many* humans, not just one, so no single error will be a
factor, plus the errors are caught ahead of time. it can also process a
lot more information in a lot less time.

the system could evaluate the probability of a collision of each
possible path and choose the best one in a fraction of a second. a
human cannot.
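to illustrate the kind of evaluation being described, here's a toy
sketch that scores a handful of candidate maneuvers by estimated
collision probability and picks the safest. the maneuver names and
probability numbers are made up purely for illustration; a real
planner would compute these from sensor data in real time:

```python
# Toy sketch: pick the candidate path with the lowest estimated
# collision probability. All names and numbers are hypothetical.

def safest_path(candidates):
    """candidates: dict mapping maneuver name -> estimated collision probability."""
    return min(candidates, key=candidates.get)

estimates = {
    "brake_hard": 0.05,     # assumed probabilities, illustration only
    "swerve_left": 0.20,
    "swerve_right": 0.35,
    "maintain_speed": 0.90,
}

print(safest_path(estimates))  # → brake_hard
```

a real system would re-run this evaluation many times per second over
far more candidates; the point is only that the comparison itself is
trivial for a machine and impossible for a human at that rate.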

also, autonomous vehicles don't get tired, can't be impaired by alcohol
or other drugs, aren't distracted by passengers, scenery, fumbling with
the radio, phone or nav system (or paper maps in the old days), among
other things.

errors won't be zero, but it will be close to it.

which not only means
fewer crashes, but it also means vehicles can be closer together,
thereby increasing road capacity and greatly reducing traffic problems.


You've just reinvented the train.


nope.


The only action the car could have taken to avoid the crash was
emergency braking, but it was disabled. So, yes, it would have slammed
on the brakes.


absolutely wrong.

again, the death was the fault of the operator, who was watching tv
instead of paying attention to the road *and* the pedestrian, who
stepped in front of a moving vehicle outside of a marked crosswalk at
night, and who also tested positive for methamphetamine and marijuana.


Victim blaming and you're making **** up. You can assert all you like.
Doesn't make it right.


it's not victim blaming nor is anything being made up.

it's very clear that had the operator of the uber vehicle been paying
attention, there would not have been a death. the pedestrian is also
partly at fault since her judgement was impaired due to drugs.

i see you snipped the proof.

here it is again:
https://www.abc15.com/news/region-so...tempe-police-r
elease-new-video-from-deadly-self-driving-uber-crash
Officers calculated that had Vasquez been paying attention, she could
have reacted 143 feet before impact and brought the SUV to a stop
about 42.6 feet before hitting Herzberg.

"This crash would not have occurred if Vasquez would have been
monitoring the vehicle and roadway conditions and was not
distracted," the report stated.
....
The report also says Herzberg unlawfully crossing the road at an
unmarked location was a factor in the crash.

and because of that, the operator of the uber vehicle is being charged
with manslaughter:

https://www.azcentral.com/story/news...22/self-drivin
g-uber-fatal-crash-prosecution-may-precedent-setting/726652002/
Investigators later determined the crash would not have occurred if
Vasquez had been "monitoring the vehicle and roadway conditions and
was not distracted."

...A police report indicated investigators are seeking a manslaughter
charge against Vasquez.
....
"She was clearly derelict in her job and that she knew that was not
permitted and, as a result of that, an innocent person was killed,"
Marchant said.

Crumple zones and airbags don't avoid
accidents. Improvements in both cars and roads have helped reduce accidents
and deaths.


cars are indeed safer and crash barriers on roadways help, but that's
really about it.


There are many more ways, but you're not prepared to accept anything
outside of your opinion.


that describes you.

humans still **** up, a lot.


That's never going to stop. Even in a fully autonomous world.


the point is that autonomous vehicles will **** up a *lot* less.

they do not have to be perfect, something which is impossible.

they only need to be better than humans, which is not that difficult.

the reality is that they will be *much* better than humans, which will
greatly reduce the number of crashes, injuries and fatalities.

The tech has
to be able to compensate for it. It just can't at the moment.


nobody said it's ready today.
 



