The Elixir Outlaws now have a Patreon. If you’re enjoying the show then please consider throwing a few bucks our way to help us pay for the costs of the show.
Episode Transcript
Amos: Welcome to Elixir Outlaws the hallway track of the Elixir community.
Anna: Nice t-shirt Keathley.
Chris: Thanks.
Amos: What is your t-shirt?
Chris: It's, uh, my t-shirt is from Strange Loop. One year Carbon 5 made custom shirts for Strange Loop.
Anna: Oh yeah, that's the year I went to Strange Loop. That was fun.
Chris: That was a good year.
Anna: That was a good year at Strange Loop. That was super fun. It's a big C and a 5 from that year.
Amos: It's back in person this year.
Anna: Oh yeah? Are you going?
Amos: I don't know. Trying to debate. I don't know. I didn't get a ticket yet, so it's probably already sold out. I don't know.
Chris: Yeah. I think it's already sold out.
Anna: I do love Strange Loop.
Chris: Yeah. I have always really enjoyed that, that conference. I am not going this year. I didn't buy a ticket.
Anna: I didn't either.
Chris: Reentry is hard y'all.
Anna: It is totally hard.
Chris: Yeah. And then there's like Delta variants and blah, blah, blah. It's like, I don't even know. We don't even talk about this, but it's just, yeah. Reentry is hard. Suffice to say. Reentry is hard.
Amos: We have an in-person Elixir meetup tonight. We had one recently too, but it was like outdoor at a park, everybody's standing far apart talking. So we're, we're actually going to do it at my office today. So that'll be.
Chris: Nice. That'll be fun.
Amos: It's a good entry level reentry. I cannot imagine like Strange Loop level being like my first outing.
Anna: Oh god no! Yeah. That would be crazy.
Chris: I might implode a little bit. Yeah, no.
Amos: As extroverted as I am, um.
Chris: I ain't about that, naw.
Amos: This whole living at home has made me a little more introverted and um, for everybody who's introverted out there, I now understand you a little better and I'm so sorry.
Anna: About what? About how the world is?
Amos: So sorry about how I... I probably stress out people who are very introverted, is my guess.
Chris: You stress me out!
Amos: Cause I'm like-
Anna: Cause you're like introverted Keathley?
Chris: Yeah, highly introverted. No, no, my actual persona is not what is on display in front of you today. My actual persona is highly introverted.
Anna: Oh really?
Chris: Yeah, yeah, yeah.
Anna: I did not know that about you.
Chris: No, no. Yeah. Highly, highly introverted. Um, it's the, I don't know, assuming we're using the definition of introverted that I think people tend to like, which is the one that's like, where do you feel the most recharged? It's like, not by being around people. A hundred percent. That is draining. I got to like gear up for that. Yeah.
Anna: That's fair.
Amos: I'm a hundred percent the opposite.
Chris: Yeah. So you're weird. You're the problem.
Anna: He's just extroverted. What? Keathley!
Amos: I do need to be alone once in a while. Like to be clear, I need to be alone once in a while. I need to be alone with my thoughts. I need to walk away from people, but if I really want to get motivated and really, like, want to get excited, I go be around a lot of people.
Chris: I went out with drinks with friends.
Anna: You went out with drinks?
Chris: I went out for drinks.
Amos: There happened to be friends around.
Chris: With friends. I went out with my drinks the other night as you do.
Anna: As you do, you know.
Chris: I, no, I went out for drinks with some friends the other night, and it was just very different. I mean, it's just weird, right? Like, you've spent a year not seeing human beings in the flesh. And then you are now at a, you know, at a restaurant or a bar or whatever. And you know, you're, I don't know. It's just, it's just different, you know, like the no-mask thing is different. Like all of it, it's just weird.
Anna: It is weird. It's weird.
Chris: And we're all, you know, vaxxed and got our, you know, all that sort of stuff, and have been for a long while. So none of us are being, I mean, any more or less risky than anything else, right? It just doesn't feel that way in your head.
Anna: Well, and I still feel like it's a little bit unclear with the stuff going around. Like SF, for example, basically has herd immunity, right? Like I think over 80% of people have at least one shot, but some of the people I follow on Twitter, like this guy who's chair of medicine at UCSF, was saying he's now thinking about starting to mask indoors again, just because of the Delta variant stuff.
Chris: Yeah.
Amos: Is it that bad? I haven't heard a lot about it, I mean I've heard a little bit.
Chris: It’s bad.
Anna: It's bad.
Chris: It's really contagious.
Anna: Like way more contagious
Chris: If you can believe that.
Amos: Super awesome.
Anna: I mean, for people who are fully vaxxed, you're more likely to be okay if you get sick; it's more likely to be mild. It's the people that only have one dose of the vaccine that supposedly can still get really sick. And that's, you know, and obviously if you aren't vaccinated at all. But it's super contagious, and they still don't know exactly how it's spreading and affecting people who are vaccinated. So we're not quite out of it yet.
Chris: Yeah. The most recent thing I saw was like, so far it doesn't look any more severe if you get it; it basically has the same patterns that it's always had in terms of severity. Um, but, uh, it is way more contagious.
Amos: This is not, this is not good. Cause for me, two weeks after my second shot, I was outside with my mask in my hand and not on my face, like the Sound of Music, spinning around. I was like, "Yes!"
Chris: Yeah, I know, right? So I mean, it is, it is like overwhelmingly the people who are needing to go to the doctor or be in the hospital for the virus are people who aren't vaccinated at all, who haven't had a single shot. So far, all the data is suggesting that if you're vaccinated, you've got your two shots, you've got your two weeks, and you're post all that stuff, it seems to be highly effective against the Delta variant.
Anna: Yeah. It's highly unlikely. Not impossible, but it's highly unlikely.
Chris: Right? Exactly. Yeah. So, so far it's, it seems, you know, maybe okay. But yeah, like California is talking about masking indoors again and that sort of stuff, even if you're vaccinated and stuff like that. So, it remains to be seen.
Anna: We're not quite out of the woods yet. Yeah.
Chris: Yeah. So it's a weird, weird time to live, you know.
Amos: We are not giving medical advice.
Anna: No, we're not.
Amos: I've been reading a lot of contracts and law stuff lately, so-
Chris: No, but you should get vaccinated.
Amos: I just want to remind people, that we're not giving medical advice.
Anna: You should get vaccinated, but we are not giving medical advice.
Chris: This is not medical advice. I am not a lawyer nor a doctor.
Amos: We are not doctors.
Anna: No, we are not doctors or lawyers.
Chris: But get vaccinated.
Anna: But yes, get vaccinated.
Amos: Perfect.
Anna: Just be a good citizen.
Chris: Also, you can't listen to this program if you're anti-vaccination. I ban you.
Anna: I think you've said this before Keathley.
Chris: It'll work. They'll self select out.
Anna: They have already self selected out. I'm just saying. Anyway, what else is going on y'all? I haven't talked to y'all in a while. It's been a minute.
Chris: I think I might go to Elixir Conf.
Anna: Oh yeah. Where is Elixir Conf this year?
Amos: Austin.
Chris: It's in Austin.
Amos: In the original Lone Star-
Chris: It’s at the Norris Center, which is everybody's favorite venue.
Anna: Like isn't that like outside Austin? Is that in Austin?
Amos: It's like suburban Austin.
Chris: It's, uh, sure, in the way that I live in Chattanooga, which is to say I don't, because I live outside the city limits. But you know.
Anna: But you're gonna go to Elixir Conf.
Chris: Yeah, I think I want to, yeah.
Anna: When is it again? Is it still August?
Chris: It's in August sometime, I think. I don't even know the dates. I might submit a talk, but that feels like a lot of work.
Anna: Yeah. I don't think I'm doing any speaking for the remainder of the year.
Chris: I, that's kinda how I feel. I think I'm taking a break for the rest of the year from speaking. I gave a talk, at, uh.
Amos: I don't feel motivated enough to put together a talk.
Chris: I gave a talk at a Meetup. A few nights ago.
Amos: Oh yeah! In DC.
Chris: Yeah. Right. Uh, yes. Yeah. DC Elixir. That was super fun. Sundi asked me to come and talk about blog posts that I wrote.
Amos: Friend of the Show.
Chris: And that was super fun, and it was a nice sort of, oh yeah, Meetup talks are fun. Like, you know, much lower risk, lower effort required; it's more sort of conversational. So that was cool. That was, that was fun.
Anna: That's awesome. So Keathley, I want to talk about your strong feelings about Copilot.
Chris: Oh, you're talking about Copilot. Oh yeah. That thing can, can go to hell. Uh, yeah.
Anna: I saw your Twitter.
Amos: Yeah, we, we created a company policy that nobody's allowed to use it.
Chris: It's just frustrating. So like, here's the thing. There's a really, really good article.
Anna: I love it when Keathley starts sentences with "Here's the thing."
Chris: Here's the thing. Listen. I've given this a lot of thought.
Amos: Copilot is not real. It’s not a real thing.
Anna: That's my other favorite Keathley saying.
Chris: No, what's really true is that some machine learning boy at GitHub got in a hammock and was like, "I've given it a lot of thought and I'm pretty sure that software licenses aren't a thing," and then just decided to move forward. Because machine learning is really like Factorio. Machine learning requires data, and much like Factorio, it takes a lot of GPUs to be able to run it correctly, like quickly. But I don't know. Do you want to explain what we're talking about? You want to explain what Copilot is? Do you think there's a single person out there who doesn't know what Copilot is at this point?
Amos: We should explain it.
Chris: Explain it. Explain it, Amos.
Amos: Me?
Chris: Yes.
Anna: Sure.
Amos: Uh, Copilot is your AI pair programmer and it, the AI was trained by going through open-source software. A lot of it is GPL too. So, um.
Chris: And we're gonna get into that, but just, just without your bias, what is it?
Amos: It's like an auto-complete for your code. So you type in something like "add" as a function name, and it tries to fill in the function with exactly what you need, in lines of code. And apparently it's pretty scary: whenever you start typing, it knows exactly what function you wanted from just the words, including-
Anna: But like I don't, I mean, yes, go ahead, sorry.
Amos: Including like, like writing up all the code that you need to call out to external APIs.
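(For anyone who hasn't seen the tool, here is roughly the kind of suggestion Amos is describing, sketched in Elixir. This is not actual Copilot output; the module name, the example API URL, and the use of the HTTPoison and Jason libraries are just illustrative assumptions.)

    defmodule Example do
      # You type a function head like this...
      def add(a, b) do
        # ...and the tool proposes a plausible body:
        a + b
      end

      # For the "call out to an external API" case, a suggestion might look like:
      def fetch_user(id) do
        url = "https://api.example.com/users/#{id}"
        # HTTPoison and Jason are common Elixir libraries, assumed here for illustration.
        {:ok, %{status_code: 200, body: body}} = HTTPoison.get(url)
        Jason.decode!(body)
      end
    end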
Anna: See, I don't even like it when my email tries to auto-complete, even when it's doing the right thing. Like, I don't even like it when Google is like, you know, "Look forward to hearing from you." I'm like, don't tell me what I'm going to say. I'm just going to say it. I hate that.
Amos: I love it.
Anna: So I definitely don't want something auto completing my code.
Chris: It's kind of, it's cute, right?
Amos: I always tab it in and let it like complete it. And then I go back and change like little words around, but it knows me so well, it's really crazy.
Chris: It's very cute. The whole AI is very cute, and all the examples are very cute. Like, you can type fizzbuzz and it'll fill out fizzbuzz for you in any language. You can type, you know, Fibonacci and it'll write recursive Fibonacci algorithms for you and crap like that. But it's not magic. It's a fricking neural net. That's all it is. It's GPT-3 fed with millions and millions and millions of lines of source code. And it's important to note, which I don't think you stated explicitly, but it's important to note that GitHub wrote this thing in conjunction with Microsoft people. So that is an important thing to keep in mind. Uh, and it's important because.
Anna: Well GitHub is now Microsoft, yes?
Chris: Yes, absolutely. But I just, I think they've attempted to keep those brands very separated. Like they don't want you to know that GitHub is owned by Microsoft. Um, in a lot of ways, they really try to act like GitHub is not owned by Microsoft, and so it's like it's its own thing.
Amos: They are a wholly owned subsidiary.
Chris: Yeah. But like, that's clearly not the case anymore. But I dunno, my, the whole thing is, it's such a cute little toy. Like, I don't know, I don't get it. I don't get why this is great. And I guess it's cause I'm not some Hacker News orange-shirt-wearing startup bro who needs to be crushing it all the time and is just only concerned with shipping or some crap like that. I'm like, this is not the thing that I needed in my life. But I'm also the **** who still uses Vim on a daily basis and doesn't use any sort of completion, anything. So it's like, I guess I'm the one doing it wrong, but I don't know.
Amos: It's not, it's not the same as like an auto-complete of functions that are within your own application.
Chris: No, I mean, it’s code, right.
Amos: And the fact that it was, yeah, legally it might be fine that it uses the GPL, like if you go read the GPL license. But I'm pretty sure that anybody who has put out code under GPL did not anticipate this use case. And I understand, like, you put out something for free, you never know who's going to use it or how they're going to use it, and it's really hard to license against certain uses, right? Like somebody can use it for bad things. So we made it a policy: you cannot use Copilot at Binary Noggin on any of our code, because I also don't want GPL code ending up in my application by accident. Which you could, unless you're going to go back and check every autocomplete that it does, and who's realistically going to do that? And yes, they say it's like 0.1% of the time that it might give you verbatim code, but still, you are building on the back of intellectual property that was licensed in such a way that if you built on the back of it, then you had to give back too. If you're using Copilot, as far as I'm concerned, you're violating the spirit of the GPL, if not necessarily what the GPL says.
Chris: Yeah. I mean, so I think that's the thing. All right. All right. So let's, let's be clear. Let's just, uh, state the world real quick. So Copilot is a neural net trained on whatever subset of GitHub's, uh, repos that they deemed were permissible, which is to say anything that was public. The funniest thing about this whole thing is that there's a part of the FAQ which is like, what do we train this on? And they basically said something like, well, the ML community has decided that anything that we can get is totally fair use, uh, and so we've just decided that that was fine. I thought that was hilarious.
Amos: So, wait a second. So that means that if I have code out there, that's open source, but all rights reserved. Like I just put it out there for informational purposes, then they can utilize that?
Chris: Well, you put it on GitHub. So you gave up your rights to like-
Amos: Well, that may be true. I should go read the EULA.
Chris: Here's the thing. Don't be naive. If you put your crap on GitHub, GitHub's going to do whatever the hell they want with it. It's GitHub's code at that point.
Amos: Like, well, yeah.
Chris: And especially if you're doing it for free and you didn't pay them money, you know, and all that sort of stuff. Like, you give up your rights. We can't be naive about this. They're a company. You should have known the bed you were getting into when you got into it. It doesn't mean that you can't be frustrated, which is where I currently live. But it does mean that your frustration has to be tempered with the fact that this is kind of your fault. Like, you gave them your code.
Amos: But then where do we go? Like, if we are against Copilot, do we also remove everything from GitHub?
Chris: Well, hang on, let's talk about the problem here, right?
Amos: Ok, ok, sorry.
Chris: So, for me, it's like, they've trained this neural net on a bunch of various open-source applications, and you don't get to know which ones, cause they're not going to just tell you. And on top of that, there's this thing with neural nets where it's like, neural nets are just matrices with weights, right? Like, this is math, right? But there's this sort of infamous thing of, well, neural nets are a black box and you just can't possibly know what's happening inside of them. And to some degree that is true. That is a problem with neural nets, and debugging them is like, you don't really know how it arrived at the conclusions that it arrived at. But everybody's using that as an excuse to basically be like, well, it's impossible to know where the answers are coming from, whether it's GPL-licensed code or not, so I guess we just can't know, cause this is a neural net. Just throwing up their hands and being like, well, it's impossible to know, it's just an unknowable thing. And it was machine learning, which is obviously, uh, you know, the outcome of machine learning is that it's good by default because it's machine learning. It's cool, right? And so there's that sort of aspect to it.
Anna: There's so many problems with that statement.
Chris: That like, but on top of that, any code that they used that was licensed at all requires attribution.
Anna: Right.
Chris: There's no possible way to attribute anyone.
Anna: Unless they decided that they just don't care.
Chris: That well, and that's what they've done is they've decided that they super, just don't care and they're big enough that, that, that no one could stop them.
Amos: What they were saying was it's less than 0.1% of the time that it actually uses a chunk of code big enough to worry about attribution, because it combines from a lot of different sources. So it's only 0.1% of the time that it's even a recognizable block of code.
Chris: Is he a full robot to you?
Anna: Yeah, me too. Yeah. Amos, you're in and out.
Amos: Until-
Chris: Your internet's bad dog.
Amos: I have bad-Yup.
Anna: You sound. Like this.
Amos: That not good.
Chris: I'm going to keep talking while you fix your internet problem.
Amos: uh-a-ah-eh-oh-eh-o-I
Chris: Stop it. So I think part of it, though, is that, yeah, so it will copy stuff verbatim about 0.1% of the time. That said, there's a really, really good article someone wrote that explained, I think very succinctly and simply, why for all intents and purposes nothing that GitHub's doing really is going to violate GPL. Part of that is because, to be quite frank, the GPL never anticipated that anybody was going to harvest all of this source code, run it through, uh, you know, a laundering machine, and then spit it back out again. Like, no one anticipated that was going to happen, and so they didn't cover that specific clause or that usage or whatever. But even the stuff where they're saying, like, 0.1% of the time it will return a chunk of code verbatim, they're able to detect that and tell you as the user that that's what's happening. And the stuff that it tends to return that it's copied verbatim, at least according to their own white paper, is stuff that is kind of universally verbatim. Like it's copying a lot of header-type stuff, or a lot of sort of preamble that everyone has copied, right, that everyone's using. Or it's using an algorithm like fizzbuzz that is so well understood that it would be impossible to say, like, you can't GPL-license the fizzbuzz algorithm, because everybody's written that same algorithm, right? So that's the kind of stuff that it tends to return in 0.1% of cases. And so basically they made a very strong argument, I think persuasively, that it's not going to violate GPL, and I'm willing to say it probably doesn't violate GPL. That said, I'm still super frustrated about it. And for me, this is not about the legal repercussions of whether or not they're sticking to the letter of the law of GPL licenses or any open-source licenses. Like, I don't write a bunch of open-source software and publish libraries so that a for-profit billion-dollar company or more can harvest it and launder it through some fricking neural net and then turn it into things that make revenue for them. In the same way that I don't like what Amazon does by grabbing people's open-source projects, ripping them apart, and then stuffing them back together as a for-profit service that they then offer, thus ripping off the people who actually tried to build that stuff and build a company out of it and still provide open-source software to people. Amazon's a bunch of sleazebags. And as far as I'm concerned, sure, what they're doing is probably legal in terms of, well, the GPL says that we can do these things, blah, blah, blah. It's like, yeah, that doesn't make you right. It doesn't make you not a scumbag. And I-
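(For context, this is the kind of "universally verbatim" snippet Chris means; a minimal fizzbuzz in Elixir looks more or less the same no matter whose repo it came from. A sketch for illustration, not pulled from any particular project:)

    defmodule FizzBuzz do
      # There are only so many reasonable ways to write this, which is the point:
      # nobody can meaningfully claim ownership of it.
      def say(n) when rem(n, 15) == 0, do: "FizzBuzz"
      def say(n) when rem(n, 3) == 0, do: "Fizz"
      def say(n) when rem(n, 5) == 0, do: "Buzz"
      def say(n), do: Integer.to_string(n)
    end

    # Print 1..15 with the usual substitutions.
    Enum.each(1..15, fn n -> IO.puts(FizzBuzz.say(n)) end)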
Anna: And I think we just saw, you know, the letter of the law, like the current recent news events in general, just like how, you know, technicalities allow people to.
Chris: Yeah. Okay. Oh yeah, yeah. Yes. Uh, yeah. I don't know. It's it's like, I don't feel like GitHub is any better, uh, than Amazon in this case. It's like, you're still just laundering code and you're laundering a bunch of code for purposes that those people really didn't intend it to be used for and probably would have a problem with.
Anna: And not really with attribution to the people who did the work right.
Chris: I'm sure they're being fairly compensated though.
Anna: Oh yeah, definitely.
Chris: I'm sure they're going to, you know, super going to be fairly compensated for all those, from that billion-dollar company.
Amos: That's the pre-problem, right? Where all this code came from. And then what about the post-problem? Like, who owns the code that comes out of that? Because as much as I think that copyright around code is super weird anyway, if a large portion of people are using Copilot, a large portion are probably going to end up with blocks of code that look exactly the same. And then where do we go? And then people get dependent on it too, and they don't improve on the algorithms. We have all kinds of problems that come from that.
Anna: Also if there are bugs or if there are like, like whose, you know, like -
Chris: Or security incidents?
Anna: Or security incidents that are then shared across, you know, how many different companies.
Amos: Do I now get to sue GitHub if we have a security thing because Copilot dropped it in? No?
Chris: No. They're very clear about that.
Anna: They already have a caveat. They're basically like, we did all this stuff and it writes code for you, but also this code could contain bugs and security vulnerabilities, whatever. They take no responsibility. So it's like, use at your own risk.
Chris: Well, I mean, they're taking no responsibility for any of this. Let's be clear. And I'll say the other really upsetting thing is to see all the, like, JavaScript, uh, thought leaders, you know, being like, this is the future. And I'm like, screw you.
Amos: Well that's because NPM like, look how many modules there are. They have written all of the possible JavaScript you can write.
Chris: I shouldn't be surprised that JavaScript people really believe this is the future. Cause since when have those people considered the ramifications of something they consider to be awesome?
Amos: Those people. Those people. (laughing).
Anna: What does that mean? I don't understand.
Chris: Just JavaScript people generally. Like, since when have those people been like, hmm, maybe I shouldn't do this?
Amos: You just alienated 80% of developers.
Anna: Exactly. He totally did.
Chris: It's fine, it's, I'm, it's super frustrating to me to basically be told, oh, the people who are worried about this just can't get with the times and don't understand the change of the wind. I'm like, no, you're allowed to be frustrated about this.
Amos: Alright, I've got an idea.
Chris: Like you have every right to be frustrated about this, and should be frustrated about this to some degree.
Amos: Let's take all literature that's ever been in the public domain and create the same thing, but for writing stories. Like, all fiction. And then we will all be fiction authors, and you just start out with a word or two and then let it complete your book and sell it as yours, like you wrote it.
Chris: It's called GPT-3.
Amos: It's crazy. You're an author at that point, right?
Chris: That's literally called GPT-3.
Anna: It's been done, I guess.
Chris: Yeah. I just, the other part of that blog post that I think is funny is that they-
Anna: Which blog post are you talking about?
Chris: The one that sort of made the argument; I'll link to it. I'm trying to remember, I'll find it and tell you the exact title in just a second. But they made a very compelling argument for why this is all totally legal. Totally buy it. Totally understand that argument. And I think if that's what you're concerned about, then you have nothing to be concerned about. I think they made a very, very strong argument for why this is all totally fine. The thing that is funny, and I'm being real nitpicky about this blog post, is that there's a throwaway line in there that says people used to use the GPL, but it's not permissive enough, and so now people tend to use licenses like MIT so that other people can take advantage of their code in any sort of situation, and so ML researchers can use their code as a dataset. And they basically say the spirit of open source is to share this code so ML researchers can do whatever they want with it. And I was like, actually, no, I take issue with that. I take issue with that as part of the spirit of open source.
Anna: Right!?
Chris: I did not sign up for that. I do not publish libraries for the benefit of the poor ML researchers. Won't anyone think of the poor ML researcher who just needs data to train with? Think of the plight of the poor ML researcher making $150,000 a year to play Factorio with their GPUs. Oh no. Who will think of the ML researchers? Who will consider them? When they came for the ML researchers, I said nothing. But you know, like, that's the vibe. And it's like, that's not why I work on open source. I don't work on open source so you can harvest my data to use for whatever purposes you want.
Anna: I think the thing that is frustrating to me, and worries me a little bit with the general movement forward with something like Copilot, is the threat to the spirit of open source, right? And like why it was created in the first place, right? So many amazing things have come out of it because of, again, not the legality, but the spirit of the community, right? The intent, versus whatever legal ramifications there might be for working in open source. And I feel like this might actually be a huge drawback for anybody who wants to work in open source, right? It disincentivizes, or at least de-motivates, the desire to put stuff out there.
Chris: It's one of those, like, I've talked to Fred about this before, where Fred has said-
Anna: Fred had good tweets the other day.
Chris: Not kiddingly, you know what I mean? Fred has said, not in a joking way, that he's like, I seriously want to GPL all my code because it scares people away from using it, unless people are really serious about using it and they have to go engage a lawyer. Well, if all that code can just get laundered through some, you know, neural net somewhere, and we throw up our hands as an industry and say, well, machine learning is totally fine because, uh, it's machine learning. I mean, first of all, machine learning, it's not Bitcoin, like, cryptocurrency bad in terms of net negative for the entire world. But machine learning has done-
Anna: A lot of harm
Chris: Unchecked severe harm to the world. And basically everyone justifies it by saying, well, it's machine learning though. We have machine learning. So it must, it must be good. And it's like, yeah, but you're like, but when does the code become racist? You know what I mean?
Anna: Which it has already been proven to be, right. In many scenarios, like code that's being used in like the judicial system, which we know is, you know, fair and adequate in this country, right.
Chris: Yeah. Obviously proven, proven, proven so.
Amos: Uh, no comm-, I'm not making any comments. I'm just going to let you guys make all the political stuff. I'm just going to sit and smile.
Chris: I guess my bigger point is just, like, machine learning. Like, uh, friend of the show Mitch was talking about this; he literally made that comment of, yeah, but when does Copilot become racist? And it's like, who's to say it's not already?
Anna: There is that part. And I think a bigger question about all of this ML stuff is, like, you know, I don't know if either of you have read a book called A Short History of Progress, by a sociologist named Ronald Wright. I think that's the author. Anyway, it talks about how we think progress is always good, but that's not the case, right? Progress very quickly can become bad. You know, we went from being able to build a fire, to steel, to gunpowder, to the atomic bomb, right? Like, at some point that progress is not good for the world. And so with AI, I think that's just the pattern: throwing up your hands and saying, well, it's ML, it's good, and leaving it unchecked is actually really scary for just the general implications.
Chris: And the notion that it's unknowable.
Amos: And it has the ability to be good.
Chris: The false, the false premise that it is an unknowable outcome. That it's not possible to know how it arrived at these answers, and thus it can't really be wrong. There's this sort of excusability of AI results where they basically go, well, you know, we just can't know. I mean, here's the thing, all we did was train it on this very specific set of data that we got.
Anna: Which I think is like, it's like, it's like intent versus impact. And it's like, well, no, the intent actually doesn't matter. The impact is what matters. And so like, that's like not, not, not an okay argument. Right.
Amos: So I'm curious, like, if you trained it on something, and maybe it doesn't have a specific piece of code in it, but it ends up writing code matching some code that wasn't even part of its training set. Is that a copyright violation?
Chris: Probably not.
Amos: Like, you know, what is that? If I have a thousand monkeys typing and they type out Shakespeare, right, then have they violated, well, you can't violate Shakespeare's copyright cause it was too long ago, but you know what I mean? Like, have they violated a copyright then?
Chris: Uh, I, I don't think so. I have to imagine that what it's outputting is so (sighs). First of all, like, I haven't seen a single compelling example of this yet outside of toys. And part of that is because it's only available to Microsoft and GitHub employees at the moment, or like, you know, thought leaders, uh, or whatever. And so-
Anna: I don't think you could say that phrase with more disdain Keathley.
Chris: No one's, uh, yeah, yeah. No one has shown me literally a single compelling argument for why this is more than a toy. Like, I don't care that it can autocomplete Fibonacci. I don't care that it can autocomplete fizzbuzz. That's not interesting to me. And so no one's shown me anything that's beyond toy level. Beyond that, I very much suspect that what it's outputting tends to be wrong, or slightly off, and you're going to want to change it anyway. And if it happens to wander into some existing thing, I don't think there's any way to prove it. I don't think there's any way to prove that your intent was to copy that thing.
Anna: No, I agree.
Chris: Or that, and you obviously can't prove that the algorithm's intent was to copy that thing. But that's part of the problem is it's just an algorithm. It's just a bunch of dumb numbers that are, you know, determine an outcome. So I don't know.
Amos: It's the premise they started with, of pulling in all the open-source code. That's where I'm like, umm, unless people gave you permission, that's not, I don't think, the intention. So I want to walk away from that.
Chris: Apparently it works on Elixir.
Amos: So that means they got two Chrises and a Jose worth of code shoved in there.
Chris: There's at least two Chrises. Probably more. Uh, yeah, but I mean, I, I imagine it works with Elixir, which, which means statistically speaking, some amount of code that I've written helped to train that piece of crap. Like that's just statistically, there can't be that much Elixir on GitHub. You know what I mean? So statistically speaking, some of my code probably ended up training that thing.
Anna: Yeah, totally.
Chris: Also, while we're on the subject, let's talk about a very practical problem, a very, very practical problem. I've seen a lot of the Elixir code on GitHub. You think the formatter is bad? Woof. Like, I don't want to put that in my code, and that's not the exemplar we should be using.
Amos: I need to breathe.
Chris: This is what you want in your repo? You think these are good suggestions? I'm here to tell you, I don't trust it.
Anna: Oh man. I mean, I think that's just the larger issue: it's pulling in a bunch of random open-source projects, right? Like, how many open-source, half-baked, unfinished thought-experiment projects are there on GitHub that aren't code you would ever want to use in a production-level system?
Chris: Yeah.
Amos: Most of it.
Chris: Yeah. Yeah. That's what I'm saying. That's what I'm saying. There are very popular Elixir libraries that are half baked. Let's be clear on that.
Amos: Stay off of my Github. Stay off of my Github account.
Chris: Look, I love, I love this community. I love being a part of it. It's great. But also, I've seen a lot of that code, y'all. Like, some of that stuff's not finished, and some of it's like, I don't want this to be, this is not the pattern. I mean, here's the thing, I think I might be the only person who publishes open-source libraries who doesn't use the formatter, which means that most of the code it trained on is using the formatter. And then what if it looks at that and goes, yes, all functions need line breaks for all the arguments? Is that what you want to be putting in your repo? I don't think so. I've seen some of that nonsense.
*both Amos and Anna laughing*
Anna: Oh, Keathley.
Chris: You don't want that. So on top of whatever societal ramifications may come out of this Copilot thing, I do not believe that it's going to be all of that. I do not believe in the efficacy of it yet. No one's shown me that that's true.
Anna: Well, yeah, that's a, that's a big open question, right?
Amos: Here's what I want, though: a documentation Copilot. I want it to look at my code and just write documentation.
Chris: I have one of those. It's called. It's called my brain.
Amos: No, I don't want to have to type it.
Chris: Yeah. It's called not being lazy, I guess.
Amos: Oh.
Chris: I don't know. Like it's called not believing that computers can solve all of our problems.
Anna: Which is why we have brains. Yes.
Chris: Human beings, as it turns out, are actually pretty good. Like the least discerning human being is more discerning than the computer.
Anna: This is true.
Chris: So, you know, let that sink in for a second.
Amos: I was going to start a new consultancy totally based on Copilot. You just type in the name of your project and it writes it for you.
Chris: Also, I'm pretty sure they, like, sherlocked a company doing this. Isn't there some other AI, GitHub-repo-sourced machine learning company?
Amos: Oh I have no idea.
Chris: Yeah. It's called like Tab Nine or Nine Tabs or some crap like that. I think it also uses Stack Overflow. They pull stuff from Stack Overflow and then train their models based on Stack Overflow.
Amos: Oh gosh, that's worse than GitHub.
Chris: Yeah. No, no, no, no, no. Go look this up. Someone, someone verify this for me. I think it's called like-
Amos: Oh my gosh.
Chris: It's not Plan Nine, that's a different thing. But it's, you know, it's like Nine Tabs or Tab Nine or something like that, Tab 11, I don't know. It's like tab and a number, but it's AI. It's an AI-assisted code review or code generation tool or something like that. So I'm pretty sure GitHub, like, sherlocked this anyway.
Amos: Tab Nine.
Amos: Tab Nine. Is that what it is?
Amos: Yep.
Anna: Oh man.
Amos: The word, not the number. Oh, it totally is. I wonder where they trained this. Anyway. I-
Chris: I think they use Stack Overflow amongst other sources.
Anna: Ooooh God, because that's a good, um, tool.
Chris: Where was that bot? Wasn't there like a bot or like a VS code plugin that would just grab answers from Stack Overflow and then jam them into your code.
Anna: Oh god!
Chris: I think that's a real thing too.
Anna: That's what you want. That's definitely what you want.
Chris: There's a VS code plugin for everything. Isn't there like a Tamagotchi VS code plugin?
Anna: I don't know. It takes me back to being like five years old.
Amos: We don't have to worry about it. Tab Nine was originally for JavaScript and then Java. So we already know that you can't listen to JavaScript programmers. I've heard that from Keathley. So.
Anna: Y'all!
Amos: There, therefore-
Chris: I mean, you can listen to them if you want to.
Anna: These blanket statements are mildly problematic. Oh man.
Amos: Um, they're meant for entertainment purposes. Not for reality.
Anna: Uh-huh.
Amos: Sorry.
Chris: But I believe that. No, it's just that it's the mentality of very specific, uh, thought leaders who happen to also be JavaScript boys, who are very like, "This is the future. If you don't see how this is the future, you're just dumb," or whatever. And it's that mentality, that constant, there's-gotta-be-something-better, don't-even-worry-about-the-ramifications thing that is annoying.
*Anna laughs*
Chris: What are you laughing at?
Anna: Amos's questions.
Amos: I'm back channeling your conversation.
Anna: Yeah, Amos and I are backchanneling about you.
Chris: Listen, if you gotta say something about me, coward, you can say it to my face.
Amos: You want it recorded?
Chris: Sure. I don't care. I don't even know what you said. I go on do not disturb during recording.
Anna: He said that people call you a thought leader.
Chris: Oh god.
Amos: Just so you know.
Chris: Stop it!
Amos: I just wanted to let you know. I've I've heard you referred to as a thought leader. So.
Anna: He definitely has thoughts.
Chris: Yeah-
Anna: Keathley definitely has strong opinions. He is not lacking in strong opinions.
Amos: That's why I like people with strong opinions.
Anna: Me too.
Amos: Yep.
Chris: I'm sort of more like a thought tour guide.
Amos:(laughing) Tour guide.
Anna: Ok, stop. nope.
Chris: I don't really lead you anywhere. I just show you some sights.
Anna: Nope. Nope. Oh my god.
Chris: I walk around at conferences. With my flag. As I walk around showing you the various sights.
Anna: I'm getting you a flag, for the next time that I see you at a conference.
Chris: Everybody, everybody here now.
Anna: No, we have to stop.
Amos: This is the room where they pipe into case statements. We're not going to go in there.
Chris: This is, this is a special circle of hell, reserved for people who pipe into case statements. And talk at the theater.
Amos: Yeah. The only people worse than the people talking at the theater are the people who get on a phone call at the theater.
Chris: The people who pipe into case statements are the same people who, uh, ask questions at the end of a talk that are like, "Well actually, um, you see, you're wrong about this part of it. And I don't know if you know that, but-" Those are the people who pipe into case statements.
Anna: All right. All right y'all. I have to jump. This was hilarious. It was lovely to see your faces. Y'all should keep going.
Amos: Missed you Anna.
Chris: See ya.
Amos: I don't think I can keep going. My cheeks hurt.
Anna: Bye y'all.
Amos: Have a good one. Bye. See you Keathley.
Chris: Later.