[1843.78 --> 1846.20] And so Go standing there alone, like, uh-oh.
[1846.74 --> 1850.92] So yeah, I mean, when an AI like TensorFlow has got it in for you.
[1851.32 --> 1859.30] So if only they had invested the time to support their own products, it would have been amazing.
[1859.30 --> 1860.96] We probably would have avoided all that.
[1861.18 --> 1861.36] Okay.
[1861.50 --> 1863.06] So that's the lesson for us then.
[1863.50 --> 1869.92] Did Copilot help at all with TensorFlow, or, because it was never trained on Go, did it not even have enough to start with?
[1870.30 --> 1871.18] I'm frightened to ask.
[1871.34 --> 1871.76] That's fair.
[1871.94 --> 1872.68] I don't want to get fired.
[1872.96 --> 1874.22] Remember, Copilot is my manager.
[1874.22 --> 1881.02] Basically, Copilot is your manager because it's the only one who's able to understand even a little bit of your Go code.
[1881.14 --> 1881.62] Is this why?
[1882.12 --> 1896.14] Well, what Copilot told me was, first of all, that since I'm the last living human Go programmer (I'm not sure if it's some sort of government program or something), they have to provide me employment.
[1896.76 --> 1901.30] Maybe it's that they have to keep a human in the loop just for, like, ritual purposes.
[1901.64 --> 1902.76] I'm not exactly sure.
[1902.76 --> 1906.10] It tried to explain it to me, but I couldn't understand the math.
[1906.32 --> 1907.18] And that's what it said.
[1907.60 --> 1908.96] You wouldn't understand the math.
[1909.12 --> 1910.36] And I just sort of accepted that.
[1910.66 --> 1912.32] Was it something with the word taxes?
[1912.76 --> 1913.60] Is that still a concept?
[1914.18 --> 1915.40] No, there's no taxes in the future.
[1915.84 --> 1916.04] Oh.
[1916.36 --> 1917.36] Things that drive governments.
[1917.66 --> 1918.34] There's no money.
[1919.28 --> 1920.32] There's just canned tuna.
[1920.46 --> 1921.34] Oh, I thought there was money.
[1921.42 --> 1922.22] There was money earlier.
[1922.50 --> 1923.12] Is that canon?
[1924.10 --> 1925.98] Oh, when I used to get my blood transfusions.
[1926.10 --> 1926.54] Yeah, that's right.
[1926.54 --> 1926.98] Oh, yeah.
[1927.10 --> 1928.54] No, that doesn't count.
[1928.92 --> 1930.04] That's just Git points.
[1930.78 --> 1931.54] Git stars.
[1931.54 --> 1933.54] I just trade those when I need some fresh blood.
[1933.88 --> 1934.56] Yeah, okay, fine.
[1934.80 --> 1937.16] What's the ratio of stack points to Git points?
[1937.82 --> 1939.36] You know, that changes moment to moment.
[1939.50 --> 1941.12] Some people's whole living is off of that.
[1941.62 --> 1942.98] Oh, those COBOL developers.
[1943.40 --> 1947.68] The bot trading goes on so quickly that, you know, I don't really know.
[1947.68 --> 1949.02] I'll tell you what.
[1949.24 --> 1951.20] Bartek Plotka on Twitter.
[1951.42 --> 1962.64] He was saying that he wants the sweet max heap option for the garbage collector, and a YOLO Rust-like memory ownership model for critical portions of your program that work on the same heap.
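(The closest thing present-day Go grew to that wished-for max heap option is the soft memory limit added in Go 1.19, set via the GOMEMLIMIT environment variable or from code. A minimal sketch of that knob, where the 512 MiB cap is only an arbitrary example value, not anything mentioned on the show:)

```go
package main

import (
	"fmt"
	"runtime/debug"
)

func main() {
	// Set a soft memory limit for the Go runtime, equivalent to
	// running the program with GOMEMLIMIT=512MiB. The 512 MiB figure
	// is an illustrative value only; the limit is given in bytes.
	prev := debug.SetMemoryLimit(512 << 20)
	fmt.Println("previous soft memory limit:", prev)

	// GOGC still tunes how eagerly the collector runs under that cap;
	// 100 is the default.
	debug.SetGCPercent(100)
}
```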
[1962.90 --> 1963.44] Oh, memory.
[1963.58 --> 1963.92] Memory.
[1964.14 --> 1964.44] Memory.
[1964.70 --> 1964.88] What?
[1965.22 --> 1965.40] Sorry.
[1965.52 --> 1965.92] Yeah, memory.
[1966.16 --> 1966.56] Oh, right.
[1966.72 --> 1967.00] Right.
[1967.06 --> 1967.38] Memory.
[1967.60 --> 1968.02] You remember.
[1968.48 --> 1969.58] Oh, memory.
[1969.58 --> 1970.06] Memory.
[1970.06 --> 1971.12] I remember it well.
[1971.26 --> 1973.62] Those sweet, solid days of memory.
[1973.84 --> 1976.68] You know, you would store a one and then you would get back a one.
[1976.86 --> 1977.06] Yeah.
[1977.32 --> 1978.48] It was so good.
[1978.54 --> 1979.20] Oh, that is good.
[1979.32 --> 1980.22] It was so sweet.
[1980.22 --> 1985.52] Now with these quantum superpositions, like, you never really know, you know, are you hot?
[1985.60 --> 1986.26] Are you cold?
[1987.00 --> 1988.02] You're nine days old.
[1988.14 --> 1990.02] You don't really, really, you just don't know anymore.
[1990.24 --> 2000.74] But being able to create safe software, safe software that could run really mission-critical things, like the things inside of airplanes and cars and healthcare systems.
[2000.74 --> 2006.72] This was a place where Go could have really shined, because it had a lot of memory safety and it could have gone even further.
[2007.26 --> 2013.76] You know, it could have been a contender in this world of whatever the ISO standard for human safety was back in those days.
[2013.92 --> 2018.84] I mean, nowadays, human safety is, you know, not that important; robot safety is the most important thing.
[2019.08 --> 2026.30] But back then when humans were being protected by other humans, occasionally, Go really could have been the language if only they had said,
[2026.30 --> 2033.72] we need to focus on making a language that's safe enough to use in these kinds of embedded and mission-critical systems.
[2034.20 --> 2035.16] That would have been great.
[2035.40 --> 2035.50] Yeah.
[2035.72 --> 2037.52] You know, you talk about those quantum variables.
[2037.66 --> 2045.42] I genuinely did see some code once where somebody set a value in the code and then underneath they set it again just to make sure.
[2045.78 --> 2049.16] That was genuinely what they'd written, which I thought was just amazing.
[2049.16 --> 2055.32] I think we've had some nights when we were at the cocktail bar where we couldn't tell true from false, Matt, back in those days.
[2055.32 --> 2056.62] But yeah, that can happen.
[2056.98 --> 2058.20] No, it doesn't really matter.
[2058.56 --> 2059.20] It's all true.
[2059.36 --> 2060.12] It's all false.
[2060.70 --> 2062.70] Let the quantum processors decide.
[2063.42 --> 2069.68] Is it because all the memory units are more sensitive to cosmic radiation now that there's no ozone?
[2070.54 --> 2075.44] Well, also, you know, when you're building something that's got to survive a two-year trip to Mars,
[2075.66 --> 2080.94] believe me, your MP3s sound pretty funny by the time the ship gets to its destination.
[2081.54 --> 2082.32] Or so I've been told.
[2082.70 --> 2083.26] I don't know.
[2083.26 --> 2085.52] Actually, those might be AIs sending back those reports.
[2085.64 --> 2087.48] There might be no humans that survived the trip.
[2087.98 --> 2089.36] There's a rumor going around.
[2089.82 --> 2090.60] They're all just AIs.
[2090.74 --> 2091.56] How's it going around?
[2091.98 --> 2092.84] Who's it going around?
[2093.36 --> 2095.88] Social media still exists in 2053.
[2096.12 --> 2096.90] Oh, thank goodness.
[2098.28 --> 2099.68] I don't know what I'd do without it.