[911.20 --> 911.46] But still.
[913.20 --> 916.16] His wife, Genna, is still around.
[916.16 --> 921.72] And has stated that she's going to continue the Co-Optional Podcast as well as her own YouTube channel.
[921.94 --> 925.90] So if you want more of that flavor of content, be sure to tune in.
[926.28 --> 930.70] And unless you have something to say, I don't think I'm going to keep going.
[931.86 --> 934.70] If you want, check him out.
[934.96 --> 936.22] The YouTube channel is still there.
[936.28 --> 936.92] His Twitter is still there.
[936.96 --> 940.60] You can still watch all his old reviews of indie games if you want to check different things out.
[940.60 --> 947.14] And if you did that, if you consume that content today, you'd still be supporting those who survived him.
[947.54 --> 949.50] Like his wife, Genna, for example.
[949.82 --> 950.78] Check out her YouTube channel.
[950.90 --> 952.36] Keep watching the Co-Optional Podcast.
[952.54 --> 953.40] Do all that kind of stuff.
[954.76 --> 954.96] Yeah.
[955.06 --> 955.78] Rest in peace, John.
[956.16 --> 956.98] Rest in peace, John.
[958.08 --> 959.14] And moving on.
[959.36 --> 960.08] Because I don't want to.
[960.08 --> 961.12] No possible segue.
[961.96 --> 962.70] Literally not.
[962.70 --> 964.30] There's no possible segue now.
[964.42 --> 964.62] No.
[964.62 --> 968.52] So here's a story that we reported on when it first broke in March.
[969.22 --> 977.24] The accident that happened in Arizona with a self-driving Uber. Again, this is actually also kind of dark.
[977.70 --> 981.52] It was the first pedestrian death in a self-driving car case.
[982.38 --> 986.98] So there's been a report published by the National Transportation Safety Board.
[987.26 --> 994.30] It has all the details of what happened, what parts of the car's systems failed, the human error, the software.
[994.30 --> 996.10] All of it we know now, finally.
[997.98 --> 1003.22] So there's really like a play-by-play in the report, which you'll be able to click through.
[1003.40 --> 1005.76] It's source number three if you get access to the WAN doc.
[1005.88 --> 1011.88] Or if you go to the article that Luke has on the screen, you'll be able to click through as well from the first paragraph.
[1012.04 --> 1014.54] The WAN doc will be available on the forum after the show, by the way.
[1014.80 --> 1018.38] And this was posted on the forum, by the way, by Spartaman64.
[1018.62 --> 1019.04] Thank you.
[1019.04 --> 1028.16] So for those of you who don't know, there was a self-driving Uber cruising down a road with, I believe, two lanes.
[1028.70 --> 1029.64] It's at night.
[1029.72 --> 1032.64] There's two left-turning lanes, and then there's a bike lane on the right.
[1033.08 --> 1040.24] And basically it's nighttime, and a woman crosses the street, not at a crosswalk, which is pretty much irrelevant, walking her bicycle.
[1040.86 --> 1044.94] And the car strikes the woman, and she eventually died because of her injuries.
[1044.94 --> 1052.46] There's also dash cam footage that was released that shows what the car is heading towards, and also the inside of the car.
[1052.66 --> 1058.96] And in the view inside of the car, you can see an Uber employee, who's supposed to be responsible for taking over in tricky circumstances,
[1058.96 --> 1065.90] who appears to be looking down, possibly at a smartphone, and no one knew if this person was being negligent or what.
[1066.90 --> 1068.18] But now we have more details.
[1068.18 --> 1072.40] So this is just crazy.
[1072.58 --> 1080.42] So apparently what happened was, shortly after the story broke, we learned that there was a statement from the producers of the LIDAR systems
[1080.42 --> 1085.08] saying that they had checked and believed that their systems had performed properly.
[1085.52 --> 1087.78] And the report confirms that.
[1087.78 --> 1093.80] So what happened was, as the vehicle approached, I believe her name was Elaine,
[1094.74 --> 1097.54] at first the vehicle detected just an object.
[1097.96 --> 1100.62] This is at six seconds away from collision.
[1101.06 --> 1104.08] Then as it gets closer, it decided, okay, that's a vehicle.
[1104.24 --> 1105.70] And then finally, that's a bicycle.
[1106.68 --> 1107.70] But here's the problem.
[1110.10 --> 1112.36] Hold on.
[1112.36 --> 1120.08] At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision.
[1121.10 --> 1127.20] According to Uber, emergency braking maneuvers are not enabled while the car is under computer control
[1127.20 --> 1130.92] to reduce the potential for erratic vehicle behavior.
[1130.92 --> 1133.54] So the car sees that, oh, this is going to be crazy.
[1134.52 --> 1138.42] It's going to need some tricky steering for us to avoid this.
[1138.54 --> 1139.60] I'm not allowed to do that.
[1139.66 --> 1140.60] The driver has to do that.
[1140.60 --> 1143.28] So I should probably notify the driver.
[1143.90 --> 1146.28] There are no alerts set up.
[1146.90 --> 1149.54] The system is not designed to alert the operator.
[1149.90 --> 1151.18] Wait, it gets worse.
[1151.62 --> 1154.38] So then you're thinking, okay, but if there's someone in the driver's seat...
[1154.38 --> 1158.12] I was just going to say, this is still a problem because the car shouldn't have to alert the driver.
[1158.64 --> 1159.92] They should be paying attention.
[1160.10 --> 1162.20] Well, it still should alert the driver because you could be...
[1162.20 --> 1163.36] Yes, but it shouldn't have to.
[1163.92 --> 1164.62] It shouldn't have to.
[1164.76 --> 1167.02] It should, like, okay, no, it should have to, sorry.
[1167.40 --> 1170.34] But the problem should have still been solved.
[1170.60 --> 1170.78] Right.
[1170.88 --> 1175.16] So presumably the person sitting in the driver's seat, they're thinking, oh, this looks weird.
[1175.20 --> 1175.96] I should take over.
[1176.16 --> 1176.32] Yeah.
[1176.32 --> 1181.98] But you would still like to see, like, you might be thinking, is this a situation where the car is going to do it or not?
[1182.10 --> 1183.76] It'd be nice if the car was like, bang, bang, bang.
[1183.88 --> 1185.52] And then you're like, yeah, okay, I'm going to take over.
[1185.84 --> 1196.40] But it gets worse because the driver is actually responsible for monitoring diagnostic messages that appear on an interface in the center stack of the vehicle dash.
[1196.40 --> 1196.84] Oh.
[1196.84 --> 1199.22] And tagging events of interest for subsequent review.
[1199.40 --> 1200.82] That's what she's looking at.
[1200.96 --> 1201.40] Oh.
[1201.54 --> 1203.18] She's looking down, not at her phone.
[1203.44 --> 1205.56] I mean, this is as per her statement.
[1205.68 --> 1208.40] She did have a personal and a business phone in the car.
[1208.40 --> 1213.36] But she claims she was actually just looking at the interface that she is supposed to be looking at.
[1213.52 --> 1213.62] Yeah.
[1213.62 --> 1219.90] So now you've got a situation where Uber's paying drivers to sit in the car and look at it and not look at the road, basically.
[1220.56 --> 1223.40] And then the car is not going to go ding, ding, ding, look up.
[1223.94 --> 1228.82] And then it plows into a pedestrian, causing that person's death.
[1229.40 --> 1229.68] Jeez.
[1229.84 --> 1232.74] Like, you should have seen that coming.
[1233.10 --> 1234.86] That is silly design.
[1234.86 --> 1235.44] That is silly.
[1235.44 --> 1236.24] That's like negligent.
[1236.28 --> 1237.32] That's a bad system.