Sometimes technology sucks!

Started by TrevL, October 20, 2020, 01:00:25 PM



TrevL

Having had my car for some seven months, yesterday I finally plucked up the courage to try the automatic parking assist feature. All went well initially, although it seemed a little fast for my liking, and it was something to behold watching the steering wheel do its own thing, spinning left and then right as it manoeuvred the car into the space outside my garage. As I said, it seemed fast, so I was just applying the brake when, crunch, the back made contact with the gate post :'(. Just scratches to the plastic bumper fortunately, and a dented pride, but needless to say, I won't be using it again. I've been parking in the same spot for 31 years-ish and never hit a thing. Bloody technology! :-X

Cheers, Trev.


Time flies like an arrow, fruit flies like a banana!

Trainfish

Quite ironic that there appears to be a sensor right in the middle of the damage. If it's 7 months from new I'd take it back and say it's defective  :thumbsup:
John

To follow the construction of my layout "Longcroft" from day 1, you'll have to catch the fish below first by clicking on it which isn't difficult right now as it's frozen!

<*))))><

joe cassidy

Probably made by the same factory that Dapol use  :D

TrevL

Sadly, I've had it seven months, but it's a 2017 car. :'(
Cheers, Trev.


Time flies like an arrow, fruit flies like a banana!

keithfre

Certainly doesn't hold out much hope for the safety of self-driving cars!

stevewalker

Even if they managed to make self-driving cars 100% safe, they'd still not be practical. Imagine setting out on the morning commute, only to be repeatedly slammed against the seatbelt as cyclists, motorcyclists, drivers of normal cars and pedestrians who can't be bothered to wait a moment cut you up, speed through a junction or step out in front, knowing that the vehicle will stop. The only way I can see to prevent this is for vehicles to be programmed to occasionally, randomly, under such circumstances, stop just too late - not enough to do more than hurt, bruise or cause minor damage, but enough to discourage such actions. It is only the thought that a human driver may not stop in time that prevents such actions being common now.

njee20

What a bizarre notion. You think the only thing that stops total anarchy on the roads is the threat of collision?

Anyway - even at 3 years old it may still be in warranty (or so close they'll honour it). As Trainfish said, there is clearly a parking sensor square in the middle of that damage, which tallies with it not working. Do you get any sort of proximity reading from that particular sensor?

Mrs njee20 has the self-parking and it is a bit surreal. It doesn't do the speed though, merely the steering, so you still control how quickly it does it. Not a fan myself. It also only likes spaces big enough to park a large bus in.

Skyline2uk

Being in the industry, I am often asked why planes can fly themselves (and indeed have done for decades) and yet cars cannot drive themselves.

The answers can be found above. No matter how busy our skies are (or rather, were :(), you don't get aircraft close to each other in the sky like cars on a road. The only time they are "queuing" and "giving way" is when they bumble around airports, and yes, that's still done by the squishy bit sat up front. And yes, they often get it wrong... somewhat more expensive than the clang above (no disrespect to the OP).

There are just too many variables when some cars are self-driving and some are manually driven; it's not going to happen any time soon!

Skyline2uk

njee20

At risk of continuing that tangent there's also the fascinating hypothetical morality questions around the car deciding whether to kill its occupants or a pedestrian, and at what point that balance shifts! If a whole school ran in front of the car should it swerve and crash?!

zwilnik

Quote from: njee20 on October 20, 2020, 05:16:49 PM
At risk of continuing that tangent there's also the fascinating hypothetical morality questions around the car deciding whether to kill its occupants or a pedestrian, and at what point that balance shifts! If a whole school ran in front of the car should it swerve and crash?!

That one's a bit of a 'poorly understood science by the media' one. In engineering terms, if the AI driving the car had to make that decision it means it's already driving too fast for the conditions and has failed. It's exactly the same moral issue as a human driver in that scenario. You've already killed someone by driving poorly, it's just a matter of who.

Or simply put. An Optimist says the glass is half full, a Pessimist says it's half empty. The Engineer says it's twice as large as it needs to be :)

Malc

Quote from: zwilnik on October 20, 2020, 05:34:16 PM

Or simply put. An Optimist says the glass is half full, a Pessimist says it's half empty. The Engineer says it's twice as large as it needs to be :)
This engineer says you need to pour more in the glass. Half measures huh!
The years have been good to me, it was the weekends that did the damage.

njee20

Quote from: zwilnik on October 20, 2020, 05:34:16 PM
Quote from: njee20 on October 20, 2020, 05:16:49 PM
At risk of continuing that tangent there's also the fascinating hypothetical morality questions around the car deciding whether to kill its occupants or a pedestrian, and at what point that balance shifts! If a whole school ran in front of the car should it swerve and crash?!

That one's a bit of a 'poorly understood science by the media' one. In engineering terms, if the AI driving the car had to make that decision it means it's already driving too fast for the conditions and has failed. It's exactly the same moral issue as a human driver in that scenario. You've already killed someone by driving poorly, it's just a matter of who.

Or simply put. An Optimist says the glass is half full, a Pessimist says it's half empty. The Engineer says it's twice as large as it needs to be :)

Whilst I know what you mean, until you control all the variables in the system there remains a chance, although it will of course be vanishingly small. The alternative is that self driving cars do not exceed walking pace, and stop whenever other vehicles are nearby. Although that technically 'sacrifices' the occupants anyway! Any engineer who thinks there is zero chance of collision (and therefore a potential 'decision' to be made) is wrong.

One of the difficulties of automating cars is the sheer number of variables in the system. Much easier to automate trains.

Paddy

That is a real shame and so frustrating.

Personally speaking, I have no desire for automated cars that drive and/or park themselves.  I will make an exception for Climate Control though.  ;)

This is not so much driven by a fear of the technology (although, as someone who has spent their entire career in the software industry, I would not trust it), but rather that I enjoy driving my car.  If I wanted to be a passenger then I would get a taxi, bus or train.  Even automatic gearboxes are a step too far for me, although sadly I think the gear shift will become a thing of the past.

My biggest bugbear is the replacement of all the knobs and switches in new cars by a touch screen.  IMHO they offer a less tactile, harder-to-use experience.  In addition, I fear they will prove to be dangerous, especially in right-hand-drive vehicles.  If you are not allowed to use a phone whilst driving, then how on earth is a tablet any better?

Ah well, I guess this is progress and I am just showing my age...  Maybe that is why I like steam engines.  :D

Kind regards

Paddy
HOLLERTON JUNCTION (SHED 13C)
London Midland Region
http://www.ngaugeforum.co.uk/SMFN/index.php?topic=11342.0


BARRIES'S TRAIN SHED - HIGHLY RECOMMENDED
https://www.youtube.com/channel/UChVzVVov7HJOrrZ6HRvV2GA

Trainfish

Quote from: Malc on October 20, 2020, 05:41:18 PM
Quote from: zwilnik on October 20, 2020, 05:34:16 PM

Or simply put. An Optimist says the glass is half full, a Pessimist says it's half empty. The Engineer says it's twice as large as it needs to be :)
This engineer says you need to pour more in the glass. Half measures huh!

This engineer says you need a bigger glass. Maybe something like this which holds 2.5 pints  :thumbsup: :beers:

John

To follow the construction of my layout "Longcroft" from day 1, you'll have to catch the fish below first by clicking on it which isn't difficult right now as it's frozen!

<*))))><

stevewalker

Quote from: zwilnik on October 20, 2020, 05:34:16 PM
Quote from: njee20 on October 20, 2020, 05:16:49 PM
At risk of continuing that tangent there's also the fascinating hypothetical morality questions around the car deciding whether to kill its occupants or a pedestrian, and at what point that balance shifts! If a whole school ran in front of the car should it swerve and crash?!
That one's a bit of a 'poorly understood science by the media' one. In engineering terms, if the AI driving the car had to make that decision it means it's already driving too fast for the conditions and has failed. It's exactly the same moral issue as a human driver in that scenario. You've already killed someone by driving poorly, it's just a matter of who.

Not at all. If an AI car (or a normal one) is approaching a green light, where there is a red light for pedestrians, it cannot be expected to slow so far that, even at the last second, it could stop if a pedestrian suddenly set off against the lights. It is not that the AI has failed, it is that someone else has broken the rules. At that point the decision becomes: do you risk the occupants of the vehicle, who have themselves done nothing wrong, or the pedestrian who has?
