
Thread: Autonomous Vehicles

  1. #1
    Autonomous Vehicles

    We've talked about self-driving cars a bit here and there. Let's do it some more.

    The recent review from the NTSB about the incident where a self-driving Uber hit and killed a pedestrian/bicyclist (she was pushing a bike) has some interesting revelations (to me, anyhow).

    NTSB: Uber Self-Driving Car Had Disabled Emergency Brake System Before Fatal Crash : The Two-Way : NPR

    To review:
    A Volvo fitted with Uber's self-driving system hit and killed a woman. She was walking across the road pushing a bicycle. It was after dark. The "driver" wasn't looking at the road until just before impact.

    I'm not trying to sound harsh or insensitive, but the woman who was hit and killed is primarily the one to blame here. She was walking across a four-lane road in the dark and wasn't looking out for traffic. It's sad she's dead, but, had she been watching for traffic, she'd still be alive. It's not like the car randomly swerved and jumped the curb and hit her - she walked in front of it. That said, the car should have seen her and acted accordingly - I mean, if I'm driving down the road and I see someone walking across it, I try not to hit them, even if they shouldn't be walking out in front of me. Happens all the time actually, and so far I've yet to mow anyone down.

    Now, let's discuss the car. What's surprising to me is how the Uber system works. It recognized something was in the road about 6 seconds before it hit her. It registered a bicycle a little over 1 second before it hit her.

    The two big "WTFs" in my mind:
    1. The car did not brake or use any evasive maneuvering. The Volvo has factory-installed automatic emergency braking that will act on its own, but the Uber system turns it off "to prevent erratic behavior". The system relies on the "driver" to intervene with braking and/or steering.
    2. The Uber system does not give any alerts to the driver that it has seen something in the road.


    In other words, the car 'thought' "oh look there's a person directly in my path of travel" and then did nothing.

    It seems to me that #1 (auto braking is turned off) is done because the system isn't good enough to be trusted. And with that in mind, I can't for the life of me understand why #2 would be true. Why wouldn't it alert the driver?

    With self-driving cars, once people get used to them and start trusting them, they will not be paying attention to the road. Therefore, you cannot count on them to do emergency braking or maneuvering. That's exactly what happened here: the "driver" wasn't looking at the road for quite some time. And if these systems do require 100% attention from the driver, then what's the point at all? People will certainly pay even less attention in a self-driving car than they do now in one they have to drive themselves.

    Further, I wonder what the process was like that approved the use of these things. Did the NTSB or Arizona DOT know that the system would turn off automatic emergency braking, and not alert the driver to potential dangers?
    Last edited by dgaddis; 05-25-2018 at 10:21 AM.
    Dustin Gaddis
    www.MiddleGaEpic.com
    Why do people feel the need to list all of their bikes in their signature?

  2. #2
    Re: Autonomous Vehicles

    I read an article a few days ago similar to the one you referenced. I was surprised to read that the system(s) had been disabled on the car. When the actual accident occurred, I was stunned that it happened the way it did. These automated systems were supposed to save lives, and in this case they didn’t. Now we find out stuff was disabled. I wonder if the driver knew.

    I’m not sure what to believe anymore related to the news reports on these cars: testing, miles driven with no accidents, etc. Are they working or not? I know there are teething pains with any new system of this magnitude, but I wonder what the true reliability of this stuff is.

    Personally, I think automated cars and the push for them is pure snake oil. It isn’t going to work. It can’t work just by putting gadgets in a car. There will have to be infrastructure added to the road system itself, and right now in my area, merely filling craters/potholes seems to be a chore. If the system doesn’t work everywhere, it shouldn’t be used anywhere.

    I could go on. I’m not normally a betting person, but I would wager the farm that in 10 years, 20 years, whatever, automated cars will not have saved any more lives than proper driver training. IF you could even quantify either.

  3. #3
    Re: Autonomous Vehicles

    Quote Originally Posted by becomingblue View Post
    I would wager the farm that in 10 years, 20 years, whatever, automated cars will not have saved any more lives than proper driver training. IF you could even quantify either.
    With the increasing amount of distracted driving, I would take that bet.
    Matt Moore

  4. #4
    Re: Autonomous Vehicles

    Quote Originally Posted by becomingblue View Post
    I read an article a few days ago similar to the one you referenced. I was surprised to read that the system(s) had been disabled on the car. When the actual accident occurred, I was stunned that it happened the way it did. These automated systems were supposed to save lives, and in this case they didn’t. Now we find out stuff was disabled. I wonder if the driver knew.

    I’m not sure what to believe anymore related to the news reports on these cars: testing, miles driven with no accidents, etc. Are they working or not? I know there are teething pains with any new system of this magnitude, but I wonder what the true reliability of this stuff is.

    Personally, I think automated cars and the push for them is pure snake oil. It isn’t going to work. It can’t work just by putting gadgets in a car. There will have to be infrastructure added to the road system itself, and right now in my area, merely filling craters/potholes seems to be a chore. If the system doesn’t work everywhere, it shouldn’t be used anywhere.

    I could go on. I’m not normally a betting person, but I would wager the farm that in 10 years, 20 years, whatever, automated cars will not have saved any more lives than proper driver training. IF you could even quantify either.
    The only way it's going to save lives is what I see as the eventual end-game of autonomous vehicles -- all other users will be banned from the public roadways, including motorcyclists, pedestrians and cyclists.
    DT

    http://www.mjolnircycles.com/

    Some are born to move the world to live their fantasies...

    "the fun outweighs the suck, and the suck hasn't killed me yet." -- chasea

    "Sometimes, as good as it feels to speak out, silence is the only way to rise above the morass. The high road is generally a quiet route." -- echelon_john

  5. #5
    Re: Autonomous Vehicles

    Agreed. But, when they pry my cold dead hands from the handlebars, as they say.

  6. #6
    Re: Autonomous Vehicles

    Saw something similar to this concept being hauled on a flatbed in DC today.


  7. #7
    Re: Autonomous Vehicles

    Quote Originally Posted by dgaddis View Post
    I'm not trying to sound harsh or insensitive, but the woman who was hit and killed is primarily the one to blame here. She was walking across a four-lane road in the dark and wasn't looking out for traffic. It's sad she's dead, but, had she been watching for traffic, she'd still be alive. It's not like the car randomly swerved and jumped the curb and hit her - she walked in front of it.
    I don't agree that she is primarily to blame. The driver of the car wasn't paying attention to where the car was going. The car was an experimental vehicle. Wouldn't paying attention to where an experimental vehicle was going be the primary job of the driver? It is an experimental vehicle, not an actual finished, ready-for-prime-time, real-world car.

    I know that area - it isn't ever totally dark there. It is extremely confusing in terms of all the various sidewalks, some of them on the median, bike paths, bike lanes, turning lanes, an upcoming traffic light, etc. Even though signs say don't cross over to the median from the sidewalk, what if you are riding in the bike lane southbound and need to get over to the northbound bike lane? They look perfect for that. And with all the ASU students in that area with apartments and nearby school buildings, you have to pay attention when driving around there.

    The lack of reflectors correctly positioned on the bike and the presence of drugs in the system of the woman killed are not conditions that make her primarily to blame, especially when there is no toxicology of the driver of the autonomous vehicle (evidently). And wearing dark clothing also does not mean she is to blame for getting hit. At most, she showed bad judgment assuming that oncoming cars were either going to allow her to cross the street there or that she had enough time to make it across the street at that point. But still she isn't to blame for her death. Bad judgment has nothing to do with blame. Bad judgment could, however, be used to describe what is an unfortunate accident - bad judgment which the driver of the autonomous car also showed, by forgetting that even though she was sitting in an autonomous car, she was still the driver of that car, not a bystander in the driver's seat. Two people + two cases of bad judgment = one death.

    The ones to blame are the officials who allowed this car on the road to use unsuspecting people as guinea pigs, with no warning that the car approaching them is being operated by a computer program. That car should have been covered with flashing lights and emitting a loud alarm to alert people to the possibility that this car might see them and not stop, and that no one in the car would stop it.

    [Attachment: Screen Shot 2018-05-25 at 8.40.42 PM.jpg]
    Last edited by j44ke; 05-25-2018 at 09:23 PM.
    Jorn Ake
    poet

    Flickr
    Books

  8. #8
    Re: Autonomous Vehicles

    I'm encouraged by autonomous vehicles for certain circumstances. Many cities have digital speed limit signs on highways that are subject to heavy traffic. Speeds are calculated based on the most efficient movement of traffic and if cars actually followed those speeds, traffic would move faster. Right now, there are many drivers that are smooth and steady in traffic while there are others that surge and stop. Autonomous cars could be controlled by traffic computers to maintain a steady flow, even if it's 35mph, it's better than stop and go.
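The steady-flow argument can be checked with a simple time-weighted average (a minimal sketch of my own; the speeds are invented, not from any traffic study):

```python
def average_speed(segments):
    """Time-weighted average speed; segments is a list of (fraction_of_time, mph)."""
    return sum(fraction * mph for fraction, mph in segments)

# A "surge and stop" driver: half the time at 60 mph, half the time stopped.
stop_and_go = [(0.5, 60.0), (0.5, 0.0)]
# Harmonized flow at a constant 35 mph.
steady = [(1.0, 35.0)]

print(average_speed(stop_and_go))  # 30.0 mph
print(average_speed(steady))       # 35.0 mph -- lower peak speed, faster trip
```

The point is that a fleet held at a steady 35 mph covers more ground than one that surges to 60 but spends half its time stopped.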
    Retired Sailor, Marine dad, semi-professional cyclist, fly fisherman, and Indian School STEM teacher.
    Assistant Operating Officer at Farm Soap homemade soaps. www.farmsoap.com

  9. #9
    Re: Autonomous Vehicles

    The blame here is Uber's. They consistently display a cavalier attitude toward regulations. They aggressively pushed to get these cars out. They aggressively went from 2 drivers to 1 driver monitoring the car in short order. This company needs to be fined heavily. They are displaying the classic "better to ask forgiveness than permission". If the punishment is not heavy, then the cavalier bet against regulations was the right one. It's that simple.

    (Has anyone seen the write-up on what the driver was looking at? Was it a display screen showing car stats that he needed to check? If so, going to one driver was not well thought out, since it required eyes off the road.)

  10. #10
    Re: Autonomous Vehicles

    Quote Originally Posted by j44ke View Post
    I don't agree that she is primarily to blame...
    Just to be clear, I wasn't saying she deserved it, or trying to lessen the value of her life. Just because she made bad choices doesn't mean she should have been killed for it. I didn't mention the toxicology report because whatever it had said, I still feel like she shouldn't have been in the road. Or at the very least she shouldn't have assumed traffic would slow to let her across. I say she's 'primarily' to blame because she was acting outside of the expected norms, and in the video it didn't look like she ever even looked to see if traffic was coming.

    But you're right, it was the combination of several mistakes. The driver, and Uber, and the officials that allowed the car on the roads are certainly partially to blame for her death too.

    If she didn't walk in front of the car, she wouldn't have been hit.
    If the car had braked, maybe it wouldn't have hit her.
    Had the driver been paying attention to the road, maybe she wouldn't have been hit.
    Dustin Gaddis
    www.MiddleGaEpic.com
    Why do people feel the need to list all of their bikes in their signature?

  11. #11
    Re: Autonomous Vehicles

    I'm also saying that the expected norms in that area are that cars will slow and yield to pedestrians even outside of crosswalks, because there are so many pedestrians (admittedly many of them distracted) in that area, given the presence of a major university. When I first arrived at ASU, I was astonished by how willing car drivers were to stop for pedestrians. It was outside of my expectations as normal behavior. My normative experience (Richmond, VA) was that you often couldn't cross the street without nearly being hit by another car. So regardless of the pedestrian's errors and those of the driver, the pedestrian could easily have expected that the car's driver would yield to her crossing the road. However, because there was no indication that this was a "special" kind of car, any pedestrian crossing the road would not have known that whatever expectations might have been reasonable in that area were not a part of the car's autonomous behavior.

    I also think had the car hit and killed an ASU student, the accident would have been a much bigger problem for the car owner and the city.
    Last edited by j44ke; 05-28-2018 at 09:51 AM.
    Jorn Ake
    poet

    Flickr
    Books

  12. #12
    Re: Autonomous Vehicles

    Quote Originally Posted by j44ke View Post
    I also think had the car hit and killed an ASU student, the accident would have been a much bigger problem for the car owner and the city.
    Agree 100%. Uber got lucky that it was "just a homeless person". Most folks have already forgotten about the whole thing.
    Dustin Gaddis
    www.MiddleGaEpic.com
    Why do people feel the need to list all of their bikes in their signature?

  13. #13
    Re: Autonomous Vehicles

    The news clip referenced in the story.



    From Bloomberg Businessweek:

    Self-Driving Cars Are Starting to Look Like a $100 Billion Bust
    2022-10-06 04:01:10.461 GMT


    By Max Chafkin
    (Bloomberg Businessweek) -- The first car woke Jennifer
    King at 2 a.m. with a loud, high‑pitched hum. “It sounded like a
    hovercraft,” she says, and that wasn’t the weird part. King
    lives on a dead-end street at the edge of the Presidio, a 1,500-
    acre park in San Francisco where through traffic isn’t a thing.
    Outside she saw a white Jaguar SUV backing out of her driveway.
    It had what looked like a giant fan on its roof—a laser
    sensor—and bore the logo of Google’s driverless car division,
    Waymo.
    She was observing what looked like a glitch in the self-
    driving software: The car seemed to be using her property to
    execute a three-point turn. This would’ve been no biggie, she
    says, if it had happened once. But dozens of Google cars began
    doing the exact thing, many times, every single day.
    King complained to Google that the cars were driving her
    nuts, but the K-turns kept coming. Sometimes a few of the SUVs
    would show up at the same time and form a little line, like an
    army of zombie driver’s-ed students. The whole thing went on for
    weeks until last October, when King called the local CBS
    affiliate and a news crew broadcast the scene. “It is kind of
    funny when you watch it,” the report began. “And the neighbors
    are certainly noticing.” Soon after, King’s driveway was hers
    again.

    Waymo disputes that its tech failed and said in a statement
    that its vehicles had been “obeying the same road rules that any
    car is required to follow.” The company, like its peers in
    Silicon Valley and Detroit, has characterized incidents like
    this as isolated, potholes on the road to a steering-wheel-free
    future. Over the course of more than a decade, flashy demos from
    companies including Google, GM, Ford, Tesla, and Zoox have
    promised cars capable of piloting themselves through chaotic
    urban landscapes, on highways, and in extreme weather without
    any human input or oversight. The companies have suggested
    they’re on the verge of eliminating road fatalities, rush-hour
    traffic, and parking lots, and of upending the $2 trillion
    global automotive industry.
    It all sounds great until you encounter an actual robo-taxi
    in the wild. Which is rare: Six years after companies started
    offering rides in what they’ve called autonomous cars and almost
    20 years after the first self-driving demos, there are
    vanishingly few such vehicles on the road. And they tend to be
    confined to a handful of places in the Sun Belt, because they
    still can’t handle weather patterns trickier than Partly Cloudy.
    State-of-the-art robot cars also struggle with construction,
    animals, traffic cones, crossing guards, and what the industry
    calls “unprotected left turns,” which most of us would call
    “left turns.”
    The industry says its Derek Zoolander problem applies only
    to lefts that require navigating oncoming traffic. (Great.) It’s
    devoted enormous resources to figuring out left turns, but the
    work continues. Earlier this year, Cruise LLC—majority-owned by
    General Motors Co.—recalled all of its self-driving vehicles
    after one car’s inability to turn left contributed to a crash in
    San Francisco that injured two people. Aaron McLear, a Cruise
    spokesman, says the recall “does not impact or change our
    current on-road operations.” Cruise is planning to expand to
    Austin and Phoenix this year. “We’ve moved the timeline to the
    left for what might be the first time in AV history,” McLear
    says.
    Cruise didn’t release the video of that accident, but
    there’s an entire social media genre featuring self-driving cars
    that become hopelessly confused. When the results are less
    serious, they can be funny as hell. In one example, a Waymo car
    gets so flummoxed by a traffic cone that it drives away from the
    technician sent out to rescue it. In another, an entire fleet of
    modified Chevrolet Bolts show up at an intersection and simply
    stop, blocking traffic with a whiff of Maximum Overdrive. In a
    third, a Tesla drives, at very slow speed, straight into the
    tail of a private jet.
    This, it seems, is the best the field can do after
    investors have bet something like $100 billion, according to a
    McKinsey & Co. report. While the industry’s biggest names
    continue to project optimism, the emerging consensus is that the
    world of robo-taxis isn’t just around the next unprotected
    left—that we might have to wait decades longer, or an eternity.
    “It’s a scam,” says George Hotz, whose company Comma.ai
    Inc. makes a driver-assistance system similar to Tesla Inc.’s
    Autopilot. “These companies have squandered tens of billions of
    dollars.” In 2018 analysts put the market value of Waymo LLC,
    then a subsidiary of Alphabet Inc., at $175 billion. Its most
    recent funding round gave the company an estimated valuation of
    $30 billion, roughly the same as Cruise. Aurora Innovation Inc.,
    a startup co-founded by Chris Urmson, Google’s former
    autonomous-vehicle chief, has lost more than 85% since last year
    and is now worth less than $3 billion. This September a leaked
    memo from Urmson summed up Aurora’s cash-flow struggles and
    suggested it might have to sell out to a larger company. Many of
    the industry’s most promising efforts have met the same fate in
    recent years, including Drive.ai, Voyage, Zoox, and Uber’s self-
    driving division. “Long term, I think we will have autonomous
    vehicles that you and I can buy,” says Mike Ramsey, an analyst
    at market researcher Gartner Inc. “But we’re going to be old.”

    Our driverless future is starting to look so distant that
    even some of its most fervent believers have turned apostate.
    Chief among them is Anthony Levandowski, the engineer who more
    or less created the model for self-driving research and was, for
    more than a decade, the field’s biggest star. Now he’s running a
    startup that’s developing autonomous trucks for industrial
    sites, and he says that for the foreseeable future, that’s about
    as much complexity as any driverless vehicle will be able to
    handle. “You’d be hard-pressed to find another industry that’s
    invested so many dollars in R&D and that has delivered so
    little,” Levandowski says in an interview. “Forget about
    profits—what’s the combined revenue of all the robo-taxi, robo-
    truck, robo-whatever companies? Is it a million dollars? Maybe.
    I think it’s more like zero.”
    In some ways, Levandowski is about as biased a party as
    anyone could be. His ride on top of the driverless wave ended in
    ignominy, after he moved from Google to Uber Technologies Inc.
    and his old bosses sued the crap out of his new ones for, they
    said, taking proprietary research along with him. The
    multibillion-dollar lawsuit and federal criminal case got
    Levandowski fired, forced him into bankruptcy, and ended with
    his conviction for stealing trade secrets. He only avoided
    prison thanks to a presidential pardon from Donald Trump.
    On the other hand, Levandowski is also acknowledged, even
    by his detractors, as a pioneer in the industry and the person
    most responsible for turning driverless cars from a science
    project into something approaching a business. Eighteen years
    ago he wowed the Pentagon with a kinda-sorta-driverless
    motorcycle. That project turned into Google’s driverless Prius,
    which pushed dozens of others to start self-driving car
    programs. In 2017, Levandowski founded a religion called the Way
    of the Future, centered on the idea that AI was becoming
    downright godlike.
    What shattered his faith? He says that in the years after
    his defenestration from Uber, he began to compare the industry’s
    wild claims to what seemed like an obvious lack of progress with
    no obvious path forward. “It wasn’t a business, it was a hobby,”
    he says. Levandowski maintains that somebody, eventually, will
    figure out how to reliably get robots to turn left, and all the
    rest of it. “We’re going to get there at some point. But we have
    such a long way to go.”
    For the companies that invested billions in the driverless
    future that was supposed to be around the next corner, “We’ll
    get there when we get there” isn’t an acceptable answer. The
    industry that grew up around Levandowski’s ideas can’t just
    reverse course like all those Google cars outside Jennifer
    King’s bedroom. And the companies that bet it all on those ideas
    might very well be stuck in a dead end.
    All self-driving car demos are more or less the same. You
    ride in the back seat and watch the steering wheel move on its
    own while a screen shows you what the computer is “seeing.” On
    the display, little red or green boxes hover perfectly over
    every car, bike, jaywalker, stoplight, etc. you pass. All this
    input feels subliminal when you’re driving your own car, but on
    a readout that looks like a mix between the POVs of the
    Terminator and the Predator, it’s overwhelming. It makes driving
    feel a lot more dangerous, like something that might well be
    better left to machines. The car companies know this, which is
    why they do it. Amping up the baseline tension of a drive makes
    their software’s screw-ups seem like less of an outlier, and the
    successes all the more remarkable.
    One of the industry’s favorite maxims is that humans are
    terrible drivers. This may seem intuitive to anyone who’s taken
    the Cross Bronx Expressway home during rush hour, but it’s not
    even close to true. Throw a top-of-the-line robot at any
    difficult driving task, and you’ll be lucky if the robot lasts a
    few seconds before crapping out.

    “Humans are really, really good drivers—absurdly good,”
    Hotz says. Traffic deaths are rare, amounting to one person for
    every 100 million miles or so driven in the US, according to the
    National Highway Traffic Safety Administration. Even that number
    makes people seem less capable than they actually are. Fatal
    accidents are largely caused by reckless behavior—speeding,
    drunks, texters, and people who fall asleep at the wheel. As a
    group, school bus drivers are involved in one fatal crash
    roughly every 500 million miles. Although most of the accidents
    reported by self-driving cars have been minor, the data suggest
    that autonomous cars have been involved in accidents more
    frequently than human-driven ones, with rear-end collisions
    being especially common. “The problem is that there isn’t any
    test to know if a driverless car is safe to operate,” says
    Ramsey, the Gartner analyst. “It’s mostly just anecdotal.”
    Waymo, the market leader, said last year that it had driven
    more than 20 million miles over about a decade. That means its
    cars would have to drive an additional 25 times their total
    before we’d be able to say, with even a vague sense of
    certainty, that they cause fewer deaths than bus drivers. The
    comparison is likely skewed further because the company has done
    much of its testing in sunny California and Arizona.
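The mileage arithmetic behind that claim works out as follows (the figures are the article's, not independently verified):

```python
# Figures as quoted in the article.
us_fatality_interval = 100_000_000  # ~1 traffic death per 100M miles driven (NHTSA)
bus_driver_interval = 500_000_000   # ~1 fatal bus-driver crash per 500M miles
waymo_total_miles = 20_000_000      # Waymo's reported total over about a decade

# How many multiples of Waymo's current total are needed before its fleet
# has even driven one expected bus-driver fatal-crash interval:
multiple_needed = bus_driver_interval / waymo_total_miles
print(multiple_needed)  # 25.0 -- the article's "25 times their total"
```

Until the fleet has driven on that order of miles, any fatality-rate comparison against the best human drivers is statistically meaningless.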
    For now, here’s what we know: Computers can run
    calculations a lot faster than we can, but they still have no
    idea how to process many common roadway variables. People
    driving down a city street with a few pigeons pecking away near
    the median know (a) that the pigeons will fly away as the car
    approaches and (b) that drivers behind them also know the
    pigeons will scatter. Drivers know, without having to think
    about it, that slamming the brakes wouldn’t just be
    unnecessary—it would be dangerous. So they maintain their speed.
    What the smartest self-driving car “sees,” on the other
    hand, is a small obstacle. It doesn’t know where the obstacle
    came from or where it may go, only that the car is supposed to
    safely avoid obstacles, so it might respond by hitting the
    brakes. The best-case scenario is a small traffic jam, but
    braking suddenly could cause the next car coming down the road
    to rear-end it. Computers deal with their shortcomings through
    repetition, meaning that if you showed the same pigeon scenario
    to a self-driving car enough times, it might figure out how to
    handle it reliably. But it would likely have no idea how to deal
    with slightly different pigeons flying a slightly different way.
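The pigeon scenario can be sketched as two toy policies (my own illustration, not any vendor's actual planner; the function names and inputs are invented):

```python
def reactive_action(obstacle_in_path: bool) -> str:
    # Sees only "obstacle / no obstacle" -- so pigeons trigger hard braking,
    # creating the rear-end risk described above.
    return "brake" if obstacle_in_path else "maintain_speed"

def predictive_action(obstacle_in_path: bool, will_clear_in_time: bool) -> str:
    # Also models the obstacle's likely motion, as a human driver does
    # without thinking about it.
    if obstacle_in_path and not will_clear_in_time:
        return "brake"
    return "maintain_speed"

# Pigeons near the median: in the path now, but they will scatter before impact.
print(reactive_action(True))                                 # "brake"
print(predictive_action(True, will_clear_in_time=True))      # "maintain_speed"
```

The hard part, of course, is the `will_clear_in_time` judgment itself, which is exactly what current systems lack.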
    The industry uses the phrase “deep learning” to describe
    this process, but that makes it sound more sophisticated than it
    is. “What deep learning is doing is something similar to
    memorization,” says Gary Marcus, a New York University
    psychology professor who studies artificial intelligence and the
    limits of self-driving vehicles. “It only works if the
    situations are sufficiently akin.”
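Marcus's memorization point can be caricatured with a literal lookup table (a deliberately crude stand-in for a learned model; the scenario keys are invented for illustration):

```python
# Scenarios the system has "seen" during training, mapped to safe actions.
memorized = {
    ("pigeons", "median", "fly_left"): "maintain_speed",
    ("pigeons", "median", "fly_right"): "maintain_speed",
}

def act(scenario):
    # Works only when the situation is "sufficiently akin" to a seen one;
    # a slightly different variant falls through to a human.
    return memorized.get(scenario, "unknown -- hand off to remote operator")

print(act(("pigeons", "median", "fly_left")))    # "maintain_speed"
print(act(("pigeons", "shoulder", "fly_left")))  # falls through: unseen variant
```

Real systems interpolate rather than match keys exactly, but the failure mode is the same: nearby-but-novel cases are exactly where the interpolation is least trustworthy.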
    And the range of these “edge cases,” as AI experts call
    them, is virtually infinite. Think: cars cutting across three
    lanes of traffic without signaling, or bicyclists doing the
    same, or a deer ambling alongside the shoulder, or a low-flying
    plane, or an eagle, or a drone. Even relatively easy driving
    problems turn out to contain an untold number of variations
    depending on weather, road conditions, and human behavior. “You
    think roads are pretty similar from one place to the next,”
    Marcus says. “But the world is a complicated place. Every
    unprotected left is a little different.”
    Self-driving companies have fallen back on shortcuts. In
    lieu of putting more cars on the road for longer, they run
    simulations inside giant data centers, add those “drives” to
    their total mile counts, and use them to make claims about
    safety. Simulations might help with some well-defined scenarios
    such as left turns, but they can’t manufacture edge cases. In
    the meantime the companies are relying on pesky humans for help
    navigating higher-order problems. All use remote operators to
    help vehicles that run into trouble, as well as safety
    drivers—“autonomous specialists,” Waymo calls them—who ride
    inside some cars to take over if there’s a problem.
    To Levandowski, who rigged up his first self-driving
    vehicle in 2004, the most advanced driverless-car companies are
    all still running what amount to very sophisticated demos. And
    demos, as he well knows, are misleading by design. “It’s an
    illusion,” he says: For every successful demo, there might be
    dozens of failed ones. And whereas you only need to see a person
    behind the wheel for a few minutes to judge if they can drive or
    not, computers don’t work that way. If a self-driving car
    successfully navigates a route, there’s no guarantee it can do
    so the 20th time, or even the second.

    In 2008, Levandowski kludged together his first self-
    driving Prius, which conducted what the industry widely
    recognizes as the first successful test of an autonomous vehicle
    on public streets. (The event was recorded for posterity on a
    Discovery Channel show called Prototype This!) Levandowski was
    aware of how controlled the environment was: The car was given
    an extremely wide berth as it made its way from downtown San
    Francisco across the Bay Bridge and onto Treasure Island,
    because there was a 16-vehicle motorcade protecting it from
    other cars and vice versa. The car did scrape a wall on its way
    off the bridge, yet he says he couldn’t help but feel amazed
    that it had all basically worked. “You saw that, and you were
    like, ‘OK, it’s a demo and there are a lot of things to work
    on,’ ” he recalls. “But, like, we were almost there. We just
    needed to make it a little better.”
    For most of the years since he built his first “Pribot,”
    Levandowski says, it’s felt as though he and his competitors
    were 90% of the way to full-blown robot cars. Executives he
    later worked with at Google and Uber were all too happy to
    insist that the science was already there, that his prototypes
    could already handle any challenge, that all that was left was
    “going commercial.” They threw around wild claims that
    investors, including the Tesla bull Cathie Wood, built into
    models to calculate that the industry would be worth trillions.
    Once again, this was a bit of self-hypnosis, Levandowski
    says. The demos with the sci-fi computer vision led him and his
    colleagues to believe they and their computers were thinking
    more similarly than they really were. “You see these amazing
    representations of the 3D world, and you think the computer can
    see everything and can understand what’s going to happen next,”
    he says. “But computers are still really dumb.”
    In the view of Levandowski and many of the brightest minds
    in AI, the underlying technology isn’t just a few years’ worth
    of refinements away from a resolution. Autonomous driving, they
    say, needs a fundamental breakthrough that allows computers to
    quickly use humanlike intuition rather than learning solely by
    rote. That is to say, Google engineers might spend the rest of
    their lives puttering around San Francisco and Phoenix without
    showing that their technology is safer than driving the old-
    fashioned way.
    In some ways the self-driving future seemed closest and
    most assured in 2017, after Levandowski went to Uber and Google
    sued them. Google accused Levandowski of taking a work laptop
    home, downloading its contents, and using that information to
    jump-start his work at Uber. (Although he doesn’t deny the
    laptop part, he’s long disputed that its contents found their
    way into anything Uber built.) The lawsuit was destabilizing but
    also validating in a way. Google’s $1.8 billion claim for
    damages suggested it had done the math based on just how
    imminent the fortunes to be made from driverless technology
    were. “People were playing for this trillion-dollar prize of
    automating all transportation,” Levandowski says. “And if you
    think it’s really just a year away, you take the gloves off.”
    Uber had promised to defend Levandowski if he was sued, but
    it fired him in May 2017, and he faced an arbitration claim in
    which Google sought to recoup hundreds of millions of dollars.
    During the 2018 trial, with Google struggling to prove Uber had
    used its trade secrets, the company settled with Uber. It got
    about $250 million in Uber stock, a fraction of what it had
    initially sought, plus a promise that the ride-hailing company
    wouldn’t use Google’s driverless technology.
    The fallout continued for Levandowski in 2019, when federal
    prosecutors announced that a grand jury had indicted him on 33
    counts of trade secrets theft. Soon after, the deal his new
    company, Pronto.ai, had been negotiating with a truck
    manufacturer—to try out Pronto’s more modest driver-assist
    feature for trucks—fell apart. “It turns out a federal
    indictment does cramp your style,” he says. An arbitration panel
    also ordered him to pay Google $179 million. He stepped down as
    Pronto’s chief executive officer, turned the company over to its
    chief safety officer, Robbie Miller, and declared bankruptcy. As
    part of a deal with prosecutors, in exchange for the dismissal
    of the other 32 counts, Levandowski pleaded guilty to one and
    was sentenced to 18 months in federal prison in August 2020.
    Because of the pandemic, the sentence was delayed long enough
    that he never served a day before his pardon, which came on the
    last day of the Trump presidency.
    According to a White House press release at the time, the
    pardon’s advocates included Trump megadonor Peter Thiel and a
    half-dozen Thiel allies, including Arizona Senate candidate
    Blake Masters and Oculus founder Palmer Luckey. Levandowski says
    that he and Thiel have some mutual friends who spoke up for him
    but that they never talked until after the pardon was announced.
    He says he doesn’t know why Thiel took up his cause, but Thiel’s
    antipathy for Google is legendary, and pardoning Levandowski
    would’ve been an opportunity to stick a thumb in the company’s
    eye. Earlier this year, Levandowski reached a settlement with
    Uber and Google over the $179 million judgment that will allow
    him to emerge from bankruptcy.
    The idea that the secret to self-driving was hidden on
    Levandowski’s laptop has come to seem less credible over time. A
year after Uber fired him, one of its self-driving cars killed a
pedestrian in Tempe, Ariz. (The safety driver was charged with
    negligent homicide and has pleaded not guilty; Uber suspended
    testing its cars on public roads and added additional safety
    measures before resuming testing. The company was never
    charged.) Uber sold its self-driving unit to Aurora, the now-
    struggling upstart, in 2020, when times were better. In
    September, Waymo claimed, based on the results of a simulation,
    that its vehicles are safer in some circumstances than humans.
    Back in the real world, the safety figures are much less
    conclusive, and Waymo is basically where it was five years ago.
    (Waymo disputes this.)

    Levandowski says his skepticism of the industry started
    around 2018. It was a little more than a year after Elon Musk
    unveiled a demo of a Tesla driving itself to the tune of Paint
    It Black. Levandowski checked the official road-test data that
    Tesla submitted to California regulators. The figures showed
    that, in that time, the number of autonomous miles Tesla had
    driven on public roads in the state totaled—wait for it—zero.
    (Tesla hasn’t reported any autonomous miles traveled in
    California since 2019. The company didn’t respond to a request
    for comment.) Although Levandowski says he admires Tesla, is
    impressed by its driver-assistance technology, and believes it
    may one day produce a truly self-driving car, he says the lack
    of progress by Musk and his peers forced him to question the
    point of his own years in the field. “Why are we driving around,
    testing technology and creating additional risks, without
    actually delivering anything of value?” he asks.
    While Tesla has argued that its current system represents a
    working prototype, Musk has continued to blur the lines between
    demos and reality. On Sept. 30 he unveiled what looked like a
    barely functional robot, promising it would unleash “a
    fundamental transformation of civilization as we know it.” Six
    years after it began selling “full self-driving” capabilities,
    Tesla has yet to deliver a driverless car. Levandowski, for his
    part, has been spending time in gravel pits.
    For more than 100 years, mining companies have been
    blasting rocks out of the hills near Santa Rosa, Calif., and
    crushing them into gravel bound for driveways, roads, and
    drains. Levandowski sometimes refers to Mark West Quarry, where
    Pronto has been operating its driverless trucks since last
    December, as a “sandbox,” and it’s easy to see why. The dusty
    mine features life-size versions of the Tonka toys you’d find in
    a child’s playroom. Yellow excavators knock enormous boulders
    down from a terraced cliffside into the mining pit, where front-
    end loaders pick up the stones and place them in 50-ton dump
    trucks to be carried to the crusher. “An 8-year-old boy’s
    dream,” Levandowski says as the boulders rattle through the
    crusher, which spits the smaller pieces out onto piles.
    The mine work started as a sort of backup plan—a way to
    bring in revenue while Pronto got trucking companies comfortable
    with using its driver-assistance technology in their long-haul
    semis. Now, Levandowski says, construction sites are Plan A.
    Pronto took the same basic system it had used on the semis and
    built it into a self-driving dump truck, adding cameras, radar,
    and an onboard computer. Because connectivity is spotty at mine
    sites, the company created its own networking technology, which
    it spun off as a separate company, Pollen Mobile LLC. “With
    mining we’re doing driverless, but controlling the environment,”
    says Pronto Chief Technology Officer Cat Culkin. BoDean Co., the
    company that owns Mark West Quarry, is one of a half-dozen
    clients that pay installation fees to retrofit dump trucks with
    sensors, plus hourly fees for use. Neither Levandowski nor
    BoDean will say how much Pronto charges or how much it’s taking
    in.
    Here’s his new vision of the self-driving future: For nine-
    ish hours each day, two modified Bell articulated end-dumps take
    turns driving the 200 yards from the pit to the crusher. The
    road is rutted, steep, narrow, requiring the trucks to nearly
    scrape the cliff wall as they rattle down the roller-coaster-
    like grade. But it’s the same exact trip every time, with no
    edge cases—no rush hour, no school crossings, no daredevil
    scooter drivers—and instead of executing an awkward multipoint
    turn before dumping their loads, the robot trucks back up the
    hill in reverse, speeding each truck’s reloading. Anthony Boyle,
    BoDean’s director of production, says the Pronto trucks save
    four to five hours of labor a day, freeing up drivers to take
    over loaders and excavators. Otherwise, he says, nothing has
    changed. “It’s just yellow equipment doing its thing, and you
    stay out of its way.”
    Levandowski recognizes that making rock quarries a little
    more efficient is a bit of a comedown from his dreams of giant
    fleets of robotic cars. His company plans to start selling its
    software for long-haul trucks in 2023. And hopefully, in a few
    decades, all his old boasts will come true: driverless cities
    with cushy commutes, zero road fatalities, and totally safe road
    naps. But for now: “I want to do something that’s real, even if
    that means scaling back the grandiose visions.”

    To contact the author of this story:
    Max Chafkin in New York at mchafkin@bloomberg.net

  14. #14
    Join Date
    Jul 2008
    Location
    NY & MN
    Posts
    5,436
    Post Thanks / Like
    Mentioned
    11 Post(s)
    Tagged
    0 Thread(s)

    Default Re: Autonomous Vehicles

    Quote Originally Posted by vertical_doug View Post
    For most of the years since he built his first “Pribot,”
    Levandowski says, it’s felt as though he and his competitors
    were 90% of the way to full-blown robot cars. Executives he
    later worked with at Google and Uber were all too happy to
    insist that the science was already there, that his prototypes
    could already handle any challenge, that all that was left was
    “going commercial.” They threw around wild claims that
    investors, including the Tesla bull Cathie Wood, built into
    models to calculate that the industry would be worth trillions.
    Once again, this was a bit of self-hypnosis, Levandowski
    says.
Like crypto, if money is sufficiently easy, just about any harebrained scheme can look viable.

Hopefully, with interest rates returning to historical norms, the oxygen will gradually get cut off for the fake-it-'til-you-make-it crowd.

  15. #15
    Join Date
    Oct 2012
    Posts
    11,036
    Post Thanks / Like
    Mentioned
    12 Post(s)
    Tagged
    0 Thread(s)

    Default Re: Autonomous Vehicles

    Quote Originally Posted by dgaddis View Post
    Just to be clear, I wasn't saying she deserved it, or trying to lessen the value of her life. Just because she made bad choices doesn't mean she should have been killed for it. I didn't mention the toxicology report because whatever it had said, I still feel like she shouldn't have been in the road. Or at the very least she shouldn't have assumed traffic would slow to let her across. I say she's 'primarily' to blame because she was acting outside of the expected norms, and in the video it didn't look like she ever even looked to see if traffic was coming.

    But you're right, it was the combination of several mistakes. The driver, and Uber, and the officials that allowed the car on the roads are certainly partially to blame for her death too.

If she hadn't walked in front of the car, she wouldn't have been hit.
    If the car had braked, maybe it wouldn't have hit her.
    Had the driver been paying attention to the road, maybe she wouldn't have been hit.
I understand it happened in an urban or city area?

The problem is not the pedestrian, the autonomous system, or the driver. The problem is that cars have no business traveling at a speed that does not allow them to stop in less than 5 meters in a city pedestrian area. Highways exist to interconnect cities, not to cross them. Any pedestrian should be allowed and able to cross a street between two slow-moving vehicles the way we used to in the early 20th century. Jaywalking was never a real problem; it was an invention of the car industry.


A barely exaggerated illustration of the issue:
    https://twitter.com/the_transit_guy/...68092256923648

    No amount of technology or driver education will ever solve those problems.
    Last edited by sk_tle; 10-07-2022 at 07:49 AM.
    --
    T h o m a s

  16. #16
    Join Date
    Apr 2011
    Location
    Hillsdale NY
    Posts
    25,532
    Post Thanks / Like
    Mentioned
    74 Post(s)
    Tagged
    0 Thread(s)

    Default Re: Autonomous Vehicles

The road where the accident occurred in Phoenix is a divided road with a planted central median and two to four lanes on either side, depending on how close you are to an intersection. In Europe it would be an autobahn with stoplight intersections. Crashes at intersections on these sorts of streets can be spectacular. I arrived at one just after a pickup truck broke a Honda Accord in half and then crashed into a quickie mart. The speed limits often seem irrelevant.
    Last edited by j44ke; 10-10-2022 at 04:22 PM.

  17. #17
    Join Date
    Jul 2008
    Location
    NY & MN
    Posts
    5,436
    Post Thanks / Like
    Mentioned
    11 Post(s)
    Tagged
    0 Thread(s)

    Default Re: Autonomous Vehicles

    Quote Originally Posted by j44ke View Post
    The road where the accident occurred in Phoenix is a divided road with a planted central median and two to four lanes on either side, depending on how close to an intersection. In Europe it would be an autobahn with stoplight intersections. Crashes at intersections on these sorts of streets can be spectacular.
It is a uniquely American species of stupidity to have high-traffic four-lane roads with cars often going 60 mph+, at grade, with lights and right turns on red.

  18. #18
    Join Date
    Jun 2008
    Location
    The Pacific Midwest
    Posts
    7,923
    Post Thanks / Like
    Mentioned
    16 Post(s)
    Tagged
    0 Thread(s)

    Default Re: Autonomous Vehicles

Look at the variety of sponsors on the University of Michigan's Mcity website, for example: Toyota, Honda, Ford, State Farm, Verizon, and Deloitte, to name a few. CAVs attract strong interest from researchers and investors alike and are tied into the subject of "Smart Cities." The relatively recent surge in research, fabrication, and test deployment of urban vehicles was seeded many years ago, and government support behind the subject matter has strong ties to the military as well. I'm not saying this in a negative manner, but the link is pretty clear when you start reading more about where the industry is headed.

    https://mcity.umich.edu/

    https://mcity.umich.edu/did-you-know...-mcity-opened/

    https://mcity.umich.edu/our-vision/fast-facts/
    rw saunders
    hey, how lucky can one man get.

  19. #19
    Join Date
    Sep 2010
    Posts
    4,836
    Post Thanks / Like
    Mentioned
    16 Post(s)
    Tagged
    0 Thread(s)

    Default Re: Autonomous Vehicles

    Quote Originally Posted by rwsaunders View Post
Look at the variety of sponsors on the University of Michigan's Mcity website, for example: Toyota, Honda, Ford, State Farm, Verizon, and Deloitte, to name a few. CAVs attract strong interest from researchers and investors alike and are tied into the subject of "Smart Cities." The relatively recent surge in research, fabrication, and test deployment of urban vehicles was seeded many years ago, and government support behind the subject matter has strong ties to the military as well. I'm not saying this in a negative manner, but the link is pretty clear when you start reading more about where the industry is headed.

    https://mcity.umich.edu/

    https://mcity.umich.edu/did-you-know...-mcity-opened/

    https://mcity.umich.edu/our-vision/fast-facts/
Only in America does the real-world condition include a stop sign shot up with buckshot.

  20. #20
    Join Date
    Jun 2008
    Location
    The Pacific Midwest
    Posts
    7,923
    Post Thanks / Like
    Mentioned
    16 Post(s)
    Tagged
    0 Thread(s)

    Default Re: Autonomous Vehicles

    Quote Originally Posted by vertical_doug View Post
Only in America does the real-world condition include a stop sign shot up with buckshot.
Wasn't that awesome? I liked the skater boy with the tossel cap pulled over his eyes and, of course, the buck.
    rw saunders
    hey, how lucky can one man get.
