Video of Tesla crash shows exactly why Autopilot isn't true self-driving tech: Reminder: Autopilot doesn't mean your car can totally drive itself


Image: Tyler Essary/Mashable. By Brett Williams, 2017-03-03 22:01:46 UTC

Listen up, Tesla fans. Your favorite carmaker's Autopilot is a super-smart marriage of high-tech hardware and software that provides one of the best semi-autonomous driving experiences currently on the road. That said: it is not a fully self-driving system and shouldn't be used as one.

A video posted to the Tesla Motors subreddit and spotted by Electrek shows exactly what can happen when a Tesla Model S driver puts too much trust in his car's Autopilot. It's a reminder that a human driver is always needed in a Tesla, even when the Autopilot system is engaged.

The accident first came to light when a redditor posted pictures of his thrashed Model S. He claimed his first-generation Autopilot never alerted him to take manual control of the car (the system is supposed to cue drivers to take over when the car can't handle road conditions) before it "misread the road" and sideswiped the highway barrier. He walked away from the crash with some minor bruises.

A few days later, video of the alleged incident was posted to the subreddit. The footage, taken by the dashcam of a car directly behind the Model S, shows a poorly marked construction zone playing havoc with the Autopilot system, running the car into the lane barrier. You can watch for yourself here.

Other redditors were quick to point out that the original poster never mentioned a construction zone when he shared the pictures, let alone the shoddy road markings that made that particular stretch of highway tricky even for attentive human drivers. Most commenters agreed on one thing: the driver wasn't paying enough attention to the road, and was therefore using Autopilot the wrong way. They're right.
As cool as it is that you can flip on Autopilot on the highway and let the car handle most of the work, it's not safe to give it free rein, or even to take your hands off the wheel.

Tesla's first-generation Autopilot system has Level 2 autonomy, which means the car can handle some of the driving responsibilities on its own, mostly braking and keeping the car in its lane. But a human still needs to be in the driver's seat keeping an eye on the road, ready to take the wheel when the machine can't handle the conditions.

Some groups, like the German government, have argued that "Autopilot" is a misnomer at best, and potentially fatal false advertising at worst. The system came under fire after the first self-driving fatality last year, when a distracted driver's Model S drove underneath a white tractor trailer. A National Highway Traffic Safety Administration (NHTSA) investigation, however, cleared Tesla of any liability, deeming Autopilot safe if used correctly.

Tesla has bigger plans for the system with Enhanced Autopilot. Updated sensor hardware, now available in new models, along with incremental software updates to be released throughout this year, will purportedly bring the system to Level 5 autonomy. That means full-on self-driving cars that can handle any and all road conditions just as well as a human driver can, and potentially better.

For now, though, full Enhanced Autopilot and Level 5 autonomy aren't here, so until they are, remember that your "self-driving" car still needs your full attention.



Permalink: https://ai-freeurmind.blogspot.com/2017/03/video-of-tesla-crash-shows-exactly-why.html
