The Elixir Outlaws now have a Patreon. If you're enjoying the show, then please consider throwing a few bucks our way to help us pay the costs of the show.
Amos: Welcome to Elixir Outlaws, the hallway track of the Elixir community.
Chris: We're good.
Amos: We're good.
Chris: Thanks for having us on your show.
Anna: Yeah, thanks y'all.
Chris: It feels like you all should know what you're doing at this point.
Justus: Well, isn't that how the show goes?
Chris: That we don't know what we're doing.
Justus: I mean, I don't know what I'm doing.
Amos: That they don't know what they're doing, Chris, you gotta polish this.
Sundi: That we don't know what we're doing.
Chris: Oh my God.
Justus: To be clear, I am Chris Keathley.
(Bursts of laughter)
Amos: I don't know that there's enough room for two of you.
Anna: I know, right? Oh my God.
Chris: We had to widen the doors.
Justus: Okay. Well, welcome to the show, everybody. So glad to have you on. I know that you don't actually do welcomes or anything like that. Um, so-
Chris: -You've already failed. (laughing).
Justus: So I'm already butchering this impression. Already butchering this impression.
Justus: Somebody. I asked, I asked, uh, I went in one of the Slack channels. I was like, all right, guys, I'm going to do my best Chris Keathley impression, what do you guys think I need to do? And they were just like, "Well just write a bunch of libraries."
Chris: (laughter) Caveat. You need to write a bunch of libraries that nobody uses.
Anna: That's not true.
Chris: Just continually produce more libraries.
Justus: Actually, I will, I will. I will. Uh, I will, uh, uh, what is it called? I will flatter you even further by admitting it was Greg Vaughn who said, "Write some awesome libraries."
Chris: Oh, wow.
Amos: There you go.
Justus: So that's a high compliment from a high person.
Justus: I mean, I don't think he's high literally. (Laughter) Figuratively high not literally high. Um, sorry, Greg.
Sundi: And for people who can't see Justus, Justus is physically doing his best Keathley impression, with an Outlaw shirt and a backwards cap that, um, matches.
Justus: I even grew my hair out for this. Over a whole year. And a little bit of five o'clock shadow. But, uh, it's the best I could do.
Amos: It's not a Chris Keathley five o'clock shadow, but it'll do.
Sundi: Well, but you know, it works.
Justus: I started thinking about this a year ago. I was like, “I'll grow my hair out.” And I started thinking about the five o'clock shadow at five o'clock this morning.
Chris: There you go. Nailed it.
Justus: So, uh, my understanding of the show is that normally you guys banter off topic, and then you mention a library. So, um, Axon- you guys heard of this?
Chris: Uh, Yeah, it's a, -
Amos: Wait. You might be too- Wait a second. You are too early in the show.
Sundi: We have more bantering. Yeah.
Sundi: It’s too early to be talking about real things.
Amos: What are you doing?
Justus: Sorry. I'm really new to this. I'm really new to it.
Chris: What? Having conversations with other humans?
Justus: Specifically the humans part. Inanimate objects, I've got plenty of experience. Lamps in particular.
Chris: It's a real like Wilson situation over there.
Justus: I've found that vegetables make the best conversationalists.
Chris: Yeah. That makes them happier, too. And I think they grow better.
Sundi: Is that how that works?
Justus: They'll grow into my belly.
Chris: Yeah. Yeah. There's like a, there's probably science.
Anna: Is that why I can't grow anything? I don’t talk to them?
Sundi: There's probably science. Nice.
Justus: Yeah. This is like any of our, like, modern sciences- they get disproved within, like, a century. Um, did you guys hear about this Amy Cuddy thing? Wasn't her whole body language thing disproven?
Sundi: What? What thing? No, I didn’t.
Amos: What are you talking about?
Chris: I don’t know.
Justus: Uh, Amy- so you've seen that Ted Talk where she goes and she says, "Oh, you want to do the Wonder Woman pose to feel good before you enter a meeting" or something?
Chris: I've heard that before.
Justus: Yeah. Yeah. And I guess it got debunked because it's social science, so, like necessarily it will always be debunked eventually.
Eric: It's like the ironclad law of social science, whatever you find just 10 years later, someone will try to duplicate your findings and realize it doesn't work.
Amos: Well, and society changes too, right?
Chris: I was gonna say, it's like computer science papers that talk about the efficacy of any given, um, you know, technique or tactic. And then it's always caveated with "as tested on the 10 undergraduates who showed up to my study that day," like, you know-
Eric: We need a larger sample size of college students.
Chris: Yeah. I, uh, I found a cross-cutting section of the five people who would talk to me to do this study. And I really feel like that is an exemplar of why TDD doesn't work.
Justus: To be fair, the hard sciences are also notoriously unreliable. If you're looking at a graph of the speed of light over time... (Makes disapproving noises). They call it a constant.
Chris: Uh-uh. Uh-huh. Uh, and never don't, don't do any research into the efficacy of double-blind studies either. That will break you. It's a little bit like-
Eric: Don't do the research!
Chris: -A little bit like reading a little bit too much about anesthesia, where you're like, "Oh, oh no, oh, this is terrifying." Like, you know, you're going to die.
Eric: So you're saying I die?
Chris: Yeah. So you're saying, I'm just dead.
Amos: We don't actually know how this works. We're just going to stick it in your body. And usually people come back.
Sundi: Hope for the best. Yeah.
Chris: It's like electromagnets, still. Like, when you study electromagnets in EE, people tell you that there's math, and there is math to some degree, but most of the math is "look in this book." We just had a bunch of undergraduates take these different metals and wrap them with wire various times and test them a lot. And that's what we know.
Justus: So, so I got another suggestion from the audience about how to impersonate Chris. And so this is an exact quote. It says, “You need to say that ‘I've been doing a lot of thinking on this and I declare that science is not a thing.’"
Justus: So... And by the way, it didn't actually say "science"- it said "insert noun here." I'm pretty sure science is a noun. So.
Chris: Yeah. Nailed it.
Justus: Killed it.
Eric: Crushed it.
Justus: We can go, everybody.
Chris: That is probably, that is probably something I do. Oh, crap. This is like, when you learn, you say certain things. Oh, no.
Amos: Refactoring is not a thing.
Anna: (chimes in) Not a thing.
Amos: Business logic.
Anna: Not a thing.
Chris: It isn't, though. But neither of those things are things.
Sundi: It must be really weird to like, it is weird when you actually study what you commonly say.
Sundi: Um, I noticed when I used to be, like, a big Photoshop person, if I were Photoshopping somebody's face, I'd get to know their face really well. And that was, like, weird for me if I was doing headshots for my coworkers or something. But then the audio side of that- um, I was just testing out Descript the other day, the, like, podcast editing thing.
Eric: Oh, yeah.
Sundi: And it prints out a transcript of everything you say, and the way you talk and the way you write is so different. Um, apparently I say "so, you know" a lot instead of "um" or "er" or "like," and, um, Justus repeats a bunch of words before you get started. And Eric-
Justus: Can we go more into that? 'Cause you said this to me one time and I was wondering what you meant, that I repeat a lot of words.
Sundi: I'll have to go. I'll have to, uh, if you were to say a word, like, let's say the word research, you might say r-r-r research, like before you get into-
Justus: Oh, so I have a stutter?
Sundi: I mean, it looks like that on, on a transcript.
Justus: Are you belittling my speech impediment?
Amos: I'm not! I'm, by science, looking into the way we all talk.
Justus: Oh, wow. I'm, I'm traumatized. My childhood trauma has been lifted to the surface for all to see, my soul bared before you.
Sundi: I was traumatized by this! "So, you know"- I said it 90 times in the course of one episode. That was my transition. That was how I started a sentence.
Justus: So, you know, I had a childhood stutter. Speech impediment, so-
Chris: 90 words in the entire episode is basically how many words Amos was able to get in in an episode.
Anna: And that's because he has very strong opinions.
Amos: And most of those are "er", "um"
Chris: No, I was- yeah, I own this. Listen, at this point, I'm aware of the nature of this podcast. I talk a lot.
Amos: I think it was episode one where, like, right off the bat, like we got like five minutes in and you said, "Well, actually," and Anna just died laughing.
Eric: Please tell me he had glasses on and pushed them up at the same time.
Anna: You did.
Amos: I got the vaccine today.
Justus: Aw, nice.
Chris: Congrats. Did you, uh, were you immediately compelled to buy a Zune?
Amos: Uh, no. They told me because I'm allergic to bee stings, I had to sit down there for 30 minutes. And so I was immediately like, “Uh, now what am I going to do?” They had me stare at a wall and sit in a chair.
Chris: So you weren't able to buy the Zune yet?
Amos: No. No.
Amos: Not on the Zune.
Justus: A what?
Chris: A Zune.
Amos: From Microsoft.
Sundi: The music player?
Chris: Let's, let's skip this. Let's just move on.
Justus: Oh, oh! Oh! From back in the day. Yeah, yeah, yeah. I remember. People thought that was, like, a cool thing. That was, like, the Nokia of MP3 players.
Chris: My ex-girlfriend, uh, had one.
Sundi: I had one.
Justus: Is that why she's your ex?
Chris: Uh, no, she's my ex because she cheated on me with my best friend.
Amos: This just got real.
Anna: This just got really real.
Chris: I'm pretty, sure those things are correlated. I feel like.
Eric: I feel like, okay, on our show, this would be the point where we say-
Sundi: -That's where we edit that out.
Anna: Correlated because science, yes, Keathley?
Chris: Yeah, no, I feel like there is, I personally feel like there-
Anna: Science. Definitely science.
Chris: -Is some amount of correlation between Zune and infidelity.
Sundi: I, okay, wow.
Eric: You called her your ex, but you didn't call him your ex- best friend.
Sundi: I have to say, I have to say, as someone who owned a Zune, for me, it was more about like, I didn't like that everyone who had an iPod, it was like, so cool. And I was like, “Alright, you know what? I'm going to be different. I have the one thing that nobody else has.” And I liked it. I actually liked it, I think, um, was it, uh, iTunes? Didn't let you put on your own music or, or something? Was that what that was? I can't remember now. I liked, a lot of my friends like made their own music, so I got to put it on there. On my MP3 player. Dare to be different.
Anna: That's pretty cool.
Chris: Care to be- yeah, don't, listen-
Amos: Think different. Zune.
Chris: Don't let anyone else keep you down.
Justus: Either that or Visual Studio Code. I assume you installed Visual Studio Code at this point, after your vaccine. Post-vaccine.
Amos: Uh, I'm about to. Uh, everybody I work with is using it, and, you know, I use Emacs and Vim- like, for 20 years I've used Emacs and Vim- and now I'm like, okay, maybe Visual Studio Code has ended the, uh, editor wars.
Eric: On what basis?
Amos: It does everything except for org mode. So, like, it's, um- and the sharing of code, like both people typing at the same time. All of it. It does it without configuration.
Anna: Yeah, the live sharing is pretty good.
Amos: Like, it's pretty amazing.
Anna: Live sharing has been there for a few years. I used that in 2018.
Justus: We'll have to come have you guys all back on our shows so we can disabuse you of these false notions.
Anna: What are your feelings on things, Justus?
Amos: I'm still using Emacs, so-
Justus: I'm only aware of one text editor. So-
Amos: Notepad? Plus plus.
Chris: No, no. Plus plus plus, come on now.
Sundi: Eric, what do you think the breakdown is at SmartLogic of people who use VS Code? I feel like it's more than half.
Eric: Well, there's people that use Vim and then there's-
Chris: -Everyone else.
Eric: Yeah, I think there's, uh, three or four of us that use Vim. And I think the rest are VS Code. There was an Atom straggler for a while. Oh, and we have one person who has VS Code as, like, her main editor, but she has, like, Sublime Text as, like, a notepad-type thing. And it's just, yeah, it's always fun to see that slide in, and, like, yeah.
Chris: Oh yeah, I remember that.
Amos: You have all the online editors that are being based off of what, Quill, now. So they have all the same shortcuts as VS Code. Like the new Livebook thing that just came out has all of the shortcuts of VS Code.
Sundi: Ooh, yes. See I didn't even, see, that's N4 on Elixir.
Amos: So, once you're learning all these shortcuts, you might as well apply them everywhere you can.
Sundi: So Livebook-
Justus: What is, what is the, too long didn't read on Livebook?
Amos: Have you ever heard of Jupyter Notebooks?
Chris: It's Jupyter Notebooks.
Amos: Yeah. Yeah. That's true.
Justus: Oh, sorry. I forgot. This is your show. You guys probably don't do this, like, oh give me the context.
Chris: No, no, no. We do.
Anna: We do.
Justus: Oh, you do. Okay. So now, I mean, I knew about it. I know what Jupyter Notebooks are.
Chris: I mean, in case anyone is not-
Chris: In case anyone is not aware, uh, Livebook is a new thing in the Elixir world. And, um, I have not used it yet, so I cannot, I don't, I can't claim any amount of actual knowledge. I've seen what's been done elsewhere.
Justus: Once you've done, once you've done some thinking about it, I'll let you declare it "not a thing."
Chris: Uh, no, no. It's a thing. It'll be, it's totally a thing. It seems like Jupyter Notebooks. I don't know, but it's Elixir, right? And I don't know what the difference between it and Jupyter Notebooks is and any of that, but.
Eric: It's just going to be better.
Chris: Probably, right. I would assume.
Eric: Just better.
Eric: Better, faster, stronger, less well-known.
Sundi: Let's just say I will mysteriously note that something fun may be coming as a result of Livebook.
Amos: Oh, nice. I’m excited
Anna: That's cool.
Sundi: That's all I have.
Amos: That's all you're going to say?
Justus: This is like, this is like a Barnum effect thing where you just predict an obviously true thing that's going to happen. Like, oh, this new technology will result in other new technologies. Don't ask me how I know. Don't ask me how I know.
Sundi: Um-hmm. Don't ask me how I know. How'd you know what I was gonna say Justus?
Chris: What, is this tarot card reading but with technology? Like-
Amos: We have one of the people at the office that stood it up on a server and is using it to keep notes when we're working on projects. So that's, that's pretty nice that everybody on the project can hop in and you can write code that runs right there.
Justus: Whoa, they did that in like, a day.
Amos: They released it in a day, yeah. Did they write the whole thing in a day? Is that what you're saying?
Justus: Yeah. No. I mean, like I just heard about this, like a couple of days ago. I feel like if he stood it up on a server, you must have just like immediately.
Amos: Oh yeah, within, like, an hour he had it running in a Docker container on his local computer, because he didn't want- it writes to your file system, and he didn't want people to be able to write to the file system. So now.
Justus: I feel you should name, drop this wizard.
Amos: Chad, Chad Fennell.
Justus: Yeah. He deserves it. That's pretty good.
Amos: Yeah. It’s pretty neat.
Chris: Yeah. That's cool. Yeah, I need to get around to it. It's like that- how, what have we landed on in terms of how we pronounce Nx?
Sundi: Exactly like that.
Chris: "N-X." We're not calling it "nix."
Amos: Oh, I thought it was just X. I thought it was a silent n.
Justus: Well, Nx is a thing. So,
Chris: Yeah, but-
Justus: We don't want to overload.
Amos: That has not stopped anybody from creating anything with the same name.
Sundi: That's true. That is so true.
Justus: I've thought about it a lot. And Nix is not a thing.
Chris: Okay. NX is what we're going with. I got that cued up. I got that cued up as well. I haven't even tried it out yet.
Chris: (Starts freestyling). So it seems- it seems cool, though. Like, as much as it's probably a thing that, like, I will- I don't know, I need to get in there- but as much as it's a thing that doesn't immediately grab me, like, "I need to install this today," it seems very cool.
Sundi: I do wonder if this will, like- I'm always thinking about what's the next thing that will make this more accessible to more people. Every time I see something new- like, LiveView was the thing, I was like, "Oh, people are going to like that as an alternative to React." And then this came out and I was thinking, oh, well, people will like this because they'll get to play around with Elixir more and then get to know it. And maybe they'll be interested in looking at it more. Maybe it's like a snippet, a teaser. I don't know. I haven't used it too much either.
Anna: Keathley, why are you shaking your head?
Sundi: No, you don't think so, Chris?
Chris: No, I don't think so. Um, I don't think people- what's the right way to say this? Early adopters adopt stuff because it's cool. Um, early adopters make up a small subset of the actual-
Sundi: Because it’s cool?
Chris: Yeah. Because it's cool. Early adopters adopt stuff that they're interested in, right. By definition, they are definitionally early adopters of things, right. And they self-select into stuff. And then they self-select out of stuff into the next thing.
Sundi: That's a different subset of people that I'm talking about, though. I'm talking about, like, right now, we're all Elixir evangelists and our friend comes along and is like, "Oh yeah, I was thinking about getting into that. I don't know where to start." Like, what's a 30-second teaser for me to, like, figure out Elixir? If there was a Livebook that you could just, like, send somebody and have them tinker around with it without installing anything.
Chris: Oh, I see!
Sundi: Without doing any, like, uh- like, there's a developer who may be interested. And for us, the evangelists who are always pushing Elixir on our dev friends, um, you know, does it make it easier for us?
Chris: I- that's a different thing than I was thinking. And I think, yeah, in that regard, sure. Lowering the barrier to entry? Super good. What I was getting at is, like, I don't think pragmatists adopt languages because of language features. I think pragmatists adopt languages because of other pragmatists. Um, and so, like, any new cutting-edge technology thing that you add to a language is not a marketing vehicle for pragmatists, because that's not why they're there. What it is a marketing vehicle for is all the people who love your thing already. And you get them further excited, and you hopefully use that to attract more practitioners, right. Because it becomes safe to use it. Um, that's my general feeling about it.
Justus: Do we want to talk about Axon since we're kind of here?
Chris: What is it? I don't even know what that is.
Justus: Oh my gosh, it's so cool. Okay. So someone took Nx- like, this is- I mean, I just love these people, that, like, a new thing comes out and they're immediately on top of it. Uh, so I guess it was Sean Moriarty, maybe.
Justus: I mean, that sounds right.
Sundi: Yeah. That was right.
Amos: He wrote the Genetic Algorithms in Elixir book.
Chris: Elixir Sherlock Holmes.
Justus: Right. So I, I hope it was him, he took Nx.
Sundi: Thank you for saying it.
Justus: Sherlock Holmes reference. Slid it in there. Elementary, my dear Watson, something, something about Baskerville hounds. Okay. Uh, Axon. So Axon, uh, is a, is a neural network library built on top of Nx.
Anna: That's cool.
Justus: Uh, so it's, yeah- I think it's, like, TensorFlow in Elixir is what they're going for. So yeah, it looks super cool. Uh, very well-documented, um, really nice syntax. Uh, they've already got a number of algorithms built into it. It looks pretty great. Can't wait to use it. Um, I want to get Sean on the show ASAP.
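Justus doesn't walk through the syntax on air, but the pipe-based model definition Axon is known for looks roughly like this- a minimal sketch, assuming Axon's layer-piping API. The layer names, shapes, and options here are illustrative, not verified against a specific Axon release:

```elixir
# A small dense network for 784-dimensional inputs (e.g., flattened
# 28x28 images), built by piping Axon layer functions together.
model =
  Axon.input("pixels", shape: {nil, 784})
  |> Axon.dense(128, activation: :relu)
  |> Axon.dropout(rate: 0.5)
  |> Axon.dense(10, activation: :softmax)
```

Because each layer function just transforms a model struct, models compose like ordinary Elixir pipelines, which is presumably the "really nice syntax" being praised.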
Amos: We don't have a guest.
Sundi: Our show? Which show?
Anna: Yeah, which show?
Sundi: Which show? Our show or your show? Eyebrow raising over here.
Amos: Eyebrow raising makes for great audio.
Chris: All the shows.
Justus: I feel like the audience already knows. Like, they can just, they can telepathically sense the vibe.
Chris: Yeah, the, um- yeah, all that stuff seems super cool. Like, I'm glad that somebody is doing all the ML stuff, like, making that story really approachable in Elixir. That's cool for me.
Amos: We're looking into it now. We've been working on a camera system for a while that detects falls in hospitals. And currently all the ML stuff is using Python and Jupyter Notebooks. And now- not all of our models can be transferred out of Python, but some of the stuff can be transferred out of Python and into Elixir. And I think that's the process that we're going to try to move to going forward. Just so that there's one stack instead of, you know, 20. Or 2.
Anna: Twenty, two same thing.
Chris: You could certainly do that before, right? Like, I have friends who built, effectively, a control plane in Elixir to serve, uh, the online results from a TensorFlow model that they, like, pre-baked using Python and a bunch of GPUs. And then they wrote C bindings, in a NIF, for all the Elixir stuff. And they served all the online results with Elixir, just calling into TensorFlow via NIF, which is, like, a thing you could do before. Um, but it's cool to be able to do the training aspect of it now as well.
Anna: Yeah, that's true. That's cool.
Eric: We worked with someone who did that, but in Rust, I think.
Chris: That sounds about right.
Eric: And then they also stopped doing that. And I don't know.
Justus: Um, 'cause this is kind of also related- Eric, this article you posted in one of our Slack channels. There is a par- first of all, do you want to give the affirmative case for what this article is saying? And then I will-
Eric: Uh, the one from last night, where machine learning is a marvelously executed scam.
Sundi: Oh, I thought that was a dream. Did I read that last night and then go back to sleep? That sounds right. I was like why does that sound like my dream?
Justus: So Eric, you give the, you know, TLDR, uh, affirmative case for what they're saying, and then I'm going to read one particular paragraph out of it. And I want everybody to tell me if any of the statements that this person made are true.
Amos: Are you getting ready to read that paragraph?
Justus: No, it's not too long. It's a pretty short paragraph.
Amos: I was gonna say, this is good audio.
Justus: It's from, uh, Last Week in AWS- which, if you don't follow Corey Quinn on Twitter-
Chris: So good. So very,
Anna: Yeah, he's hilarious.
Justus: So the, like, the gist of it is that, like, Amazon Rekognition doesn't talk about how it helps as a business case. It just says, "We got lots of ML, it'll help you." And, like, that's about it. And so the more a marketing website says "we got ML," versus, like, "here's how it helps you," the more of a, like, useless thing it is- that's the takeaway.
Anna: That sounds about right.
Chris: That sounds right.
Anna: It's like big data conversations from, like, 15 years ago.
Amos: I don't have any idea about any of that.
Justus: Okay, cool.
Chris: Here's the thing I've thought about this a lot guys. And I'm pretty sure AI machine learning is not a thing.
Justus: Savage. Savage. Alright. So, the three central claims that he's making to make this point. And you can tell me if any of them are true. Okay. So he says, to use AI slash ML effectively, you need three ingredients that are universally agreed upon by everybody. First is a vast quantity of data, which you will invariably pay your cloud provider an eye-wateringly large pile of money for. The second is a lot of compute power, specifically GPUs, which are a specialized form of compute and cost significantly more. And the third is people who are trained in this arcane form of wizardry, who are a lot like regular software engineers, except they cost a lot more money.
Chris: Yeah, all three of those, all three are super true. Yes, totally correct.
Anna: All three are true.
Justus: Okay. So really?
Justus: Cause I don't.
Chris: A hundred percent.
Justus: Who goes to AWS to buy the data. I mean, like-
Eric: -Well no, it's hosting-
Anna: -Hosting the data, yeah.
Eric: Yeah. So you're, you're putting, like, terabytes or petabytes or whatever of data in their cloud, which costs, like-
Anna: -A lot of money.
Sundi: And the data, the data, be the data.
Amos: It’s really cra-
Justus: And the third is that software engineers are basically- or, data scientists are basically software engineers, but cost more.
Chris: Yes. A hundred percent.
Chris: Yeah. A hundred percent.
Justus: I don't think so.
Chris: How do you have, how many, how many data scientists have you hired?
Justus: I've worked with many. I mean-
Chris: -'Cause they're so expensive. They're not iOS-developer expensive, but they are super expensive.
Amos: Yeah, you should, you should compare paychecks sometime.
Justus: Well, no, I know they get paid more, but I also feel like they're, because they're doing something that's just harder.
Chris: Oh, it’s not hard. No, absolutely not.
Sundi: Also, hold on, there are different kinds of data scientists, though. We just talked about this on our show, about how, like, data science is a very large, all-encompassing word now for a lot of different people in different fields. There are some data scientists who don't consider themselves engineers at all.
Justus: Look, I'm saying this in all humility, which you'll never get from me ever again: it is a lot harder to do, like, advanced math than it is to spin up a CRUD application.
Chris: You- that presumes that -you -okay. So, okay (makes incoherent noises). Rewind.
Amos: Well actually-
Anna: Well, that assumes that you're doing- like, a lot of the algorithms that are used for ML these days are created by a very small percentage of people. And those people legitimately make a shit-ton of money. A lot of money. Oops.
Chris: We curse on ours sometimes. It happens.
Anna: Yeah, that's true. I'm trying to remember, um- they make a lot of money, but everybody else uses those algorithms, and those people aren't the ones coming up with the algorithms, right. They're effectively kind of CRUD-apping, a different way, right. They're using existing algorithms, the data-
Justus: -Yeah, I feel the floor is a lot higher. I don't know. I feel like the floor is a lot higher. Like, I've met- you know, I love software engineers, but I feel like the floor of getting into web development is, like, somewhere around here, and it's high compared to, like- I don't know. I don't, man. I don't want to, like, uh-
Chris: Do you want to- you want to pull the rip cord on whatever it is you're about to say?
Sundi: I want to see him pull the ripcord.
Justus: No, because I've got a ton of respect for like, the trades. So I was going to say something like roofing, but then I was like, “Ooh, roofing is probably really hard and I don't know anything about it.”
Chris: Roofing's way harder than building a (inaudible).
Anna: Roofing's way harder, exactly.
Justus: All right. So, uh, I don't know. There are no easy jobs. Um, um, like, being a politician is, like, somewhere way down here, right. And being, like, a software engineer is probably, like, several orders of magnitude above that. And then being a data scientist- like, the floor has got to be slightly above that of software.
Chris: Yeah, no, absolutely not.
Amos: No. No.
Chris: No. I mean, it depends on what you quantify as like, it, it, it, it's all the same thing of like, I don't know.
Justus: Do you have some counterexamples? Like, are there some really dumb data scientists?
Chris: Yes. I mean I don't, I can't. I mean it’s-
Anna: -No! It’s just-
Justus: -In your life?
Amos: Please name them.
Chris: Listen, statistically speaking 50% of people are below average.
Sundi: Oh my god (laughing)
Amos: They have to be.
Justus: Yeah. But are 50, are data scientists below average?
Chris: What I'm saying is that I've worked with a lot of these people.
Justus: I feel like here's the bell curve, baseline is over here-
Amos: -Fifty percent of, of data scientists are below average data scientists. There you go.
Justus: Right. But as far as people go, they're like, way at the top, like-
Chris: -There are, nah, as far as no, I mean in-
Justus: -Nah, top 25%. I'll say that.
Chris: TensorFlow is so weaponized that, like.
Justus: You have to be one standard deviation above average to be a data scientist.
Chris: Here's, here's my pushback on that. Um, most of the time, data science teams work in their own vacuum, where, like, they are not embedded in the rest of the company. Thus, they can do whatever, and you justify it by saying it's data, right. And so you can kind of get away with doing a lot of things, including building models that may or may not work, right. Like, I've worked with those people who just built models, but building a model really just meant, how do I wire up Python into my Hadoop cluster with Storm, or Spark, or whatever, so I can, like, run all these jobs. I mean, there are truly people- to Anna's point, there are people who are out there basically inventing a science, and those people are doing incredible, like, interesting work. I don't disagree with that. I think the rank and file of people doing, like, quote-unquote deep neural networks or stuff- it's like, no, dude, you need, like, naive Bayes for this. And like that-
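The "you need naive Bayes for this" point can be made concrete. Below is a toy multinomial naive Bayes classifier in plain Elixir- the module, function names, and example data are all hypothetical, using add-one (Laplace) smoothing and log probabilities:

```elixir
defmodule ToyNaiveBayes do
  # Train on {token_list, label} pairs: count word frequencies per class.
  def train(examples) do
    examples
    |> Enum.group_by(fn {_tokens, label} -> label end)
    |> Map.new(fn {label, rows} ->
      counts =
        rows
        |> Enum.flat_map(fn {tokens, _label} -> tokens end)
        |> Enum.frequencies()

      {label, %{counts: counts, total: counts |> Map.values() |> Enum.sum(), docs: length(rows)}}
    end)
  end

  # Pick the class maximizing log prior + sum of smoothed log likelihoods.
  def classify(model, tokens) do
    n_docs = model |> Map.values() |> Enum.map(& &1.docs) |> Enum.sum()

    vocab =
      model
      |> Map.values()
      |> Enum.flat_map(&Map.keys(&1.counts))
      |> Enum.uniq()
      |> length()

    {label, _score} =
      Enum.max_by(model, fn {_label, %{counts: counts, total: total, docs: docs}} ->
        prior = :math.log(docs / n_docs)

        likelihood =
          Enum.reduce(tokens, 0.0, fn token, acc ->
            acc + :math.log((Map.get(counts, token, 0) + 1) / (total + vocab))
          end)

        prior + likelihood
      end)

    label
  end
end

model =
  ToyNaiveBayes.train([
    {["cheap", "pills"], :spam},
    {["meeting", "notes"], :ham}
  ])

ToyNaiveBayes.classify(model, ["cheap", "pills"])
#=> :spam
```

Nothing here is beyond everyday Elixir, which is the argument being made: a lot of "ML work" is this plus domain-informed feature selection, not deep math.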
Justus: -Yeah but isn't that also true for software?
Chris: Yeah, absolutely. So we're saying the same thing, right? Like it's like, there are-
Justus: -Well, no, because I'm saying that, oh, we've got a normal distribution of data scientists and we've got a normal distribution of software engineers. And the distribution for one- I mean, don't get me wrong, you know, the smartest software engineer is definitely just as smart as the smartest data scientist, but the average software engineer- I mean, look, it's just- I mean, almost mathematically, of course, you're going to have way more people now in software engineering, broadly speaking, and the average would be lower.
Amos: I think that you can take a high school student, any motivated high school student and turn them into a starting position data scientist in the same amount of time that you can turn them into a starting position web developer. Eh-
Justus: -And I think that the bar for understanding what you're doing in data science would preclude a lot more people. Like, the parameters are just more complex.
Amos: But that's the thing, is, for most things, you don't have to understand a whole lot to do it, because the tools in the background don't require you to understand. How-
Justus: -I really thought my humility here would go over really well with you all.
Chris: Counterpoint, counterpoint, counterpoint. How many people build a log in page that don't know anything about crypto?
Justus: How many people can create a login page that don't know anything? Literally 99% of them, right?
Chris: Yeah. Nailed it.
Amos: Same thing, basically. There's a course called fast.ai, they do a course.
Justus: Oh, I know. Yeah.
Amos: Oh, you took it?
Eric: So what Justus is trying to say is that he's better than the average.
Justus: Yeah! No, I'm saying I'm dumb. It's like, data science is hard. Like, it was much easier for me to learn web development than it was to learn data science. Data science was, like, way harder. And maybe that's just me cause I'm dumb. Um, which I'm perfectly willing to admit.
Amos: Your hat is on backwards.
Justus: Here, okay. Shift to a philosophical topic. I have become very happy to admit that I'm dumb. And here's the reason why: because only a dumb person can call other people dumb. Cause real recognize real, dumb recognize dumb. A smart person looks at a dumb person, and, you know, I don't know what they think, cause I'm not one of them. But, like, I look at the smart person and I'm just confused. I'm like, oh my gosh, I have no idea. This is an alien to me, right. But, but I see other dumb people and I'm like, "Ah, yeah. One of my people."
Chris: I like that the mark of becoming really good at this job is realizing all the crap you don't know.
Anna: Yes. Yes.
Chris: And, but at the same time, there's a ton of resources out there, uh, to do all kinds of stuff with machine learning. And most of machine learning is expertise. Like, most of the hard parts, the truly hard parts of machine learning, are like feature selection, and that's expertise driven. It's not anything to do with math, right.
Justus: You mean like domain expertise?
Chris: Yeah, exactly. Like, like finding the actual facets, and, like, munging data such that it actually is useful, and, like, verifying results, right? Not in, like, a Facebook-like panopticon dystopian kind of way, where you just gather all the data and you're like, did it move a key metric? And, like, that's all you look at. Um, and most of the time people implement that stuff really wrong. Like, most of the automated systems, when it comes to, like, figuring out the efficacy of your model, that stuff is wrong. There's entire papers written about people building feedback loops into the system, where they believed that the choices they were making in their models were leading them towards the right thing. And what people were actually doing is, like, logging out, like, right.
Justus: Is this why the Spotify recommendations keep getting worse, like, year after year?
Chris: Probably, yeah. Because they continue to add more people.
Sundi: It's a feedback loop. There is actually a cap, like a point at which you're only in the same recommendations, unless you vastly branch out to different music. It will just spiral in on itself and just give you the same response.
Chris: Let's also step back and look at this, right. So what was the best, quote unquote, recommendation system out there, right? It was all these papers that Netflix wrote. Who the hell thinks Netflix's recommendations are good? Not a single human being that I know and talk to on a daily basis. And yet that's, like, the cutting-edge paper that everybody's replicating. And it's like, yeah, because you weaponized it in TensorFlow, or you weaponized it in NumPy or PyTorch or one of these things. And you just applied it. That's not expertise. That's not actually being good at data science, right. That's doing what someone else did and hoping that your results are okay.
Justus: Yeah. I will say this. First of all, I love that you used the word weaponize, cause it's the same word that I use, like, exactly in the same context too. In software engineering, people say, okay, you need to learn the fundamentals to be really good at software engineering. But I also say you need to learn the abstractions. You need to be able to weaponize abstractions, even if you don't understand what they're doing underneath. And I think that the same is probably true for data science. Um, and I think that's in agreement with what you guys are saying, that a lot of people don't understand what's going on under the hood, and they're using it, and things are happening. Um, and, um, you know, I'm just glad that, um, I got to, uh, play the devil's advocate there for a minute. And, uh, you guys all disagreed with me, which, uh, verifies that I'm probably wrong here.
Amos: So you said you're dumb. So not my fault.
Chris: I don't disagree with the part of the premise that is, like... okay, let's say this: I'm going to tell you what I'm hearing you say, and you can tell me if I'm wrong. What I'm hearing you say is, learning the actual math that goes into machine learning requires you to have a certain background, uh, at minimum linear algebra, and probably differential equations, and probably calculus, and probably all these other things. And you need to know that if what you're going to do is invent science from whole cloth.
Chris: And understand, like, why the things that you're composing together do the things. I don't disagree with that at all. And, like, I went through that fast.ai course, and I went through Andrew Ng's old, like, Coursera course. Like, and-
Justus: -That was pretty, that was, like, if you don't know linear algebra already, that was kind of a challenging course.
Chris: Yeah, yeah, for sure, right. And for me it was like, "Oh, crap." Like, I have to go get a book and remember how to do all this junk and, like, how this stuff works. Um, totally don't disagree with that. I disagree with the premise that that is a requirement to do data science.
Justus: Mm mm. Okay.
Eric: That's fair. I mean, yeah, that's fair. I would not state that as a premise just because I don't know enough.
Justus: My impression of this, like data scientists generally being like smarter than me is from my interactions with them. And, and, um, they do like most of the people I've met that work in data science do actually understand the math, like way better than I do, right. Which is why I'm looking at 'em, I'm like, “Wow, the bar is really high because these people all know this stuff, you know, linear algebra.”
Anna: But it's all relative, right? It's hard to tell, because you have less experience. Sometimes, I feel like, when you're talking with someone about something you don't have a lot of depth in, and I'm not saying that these people don't have depth in that, it's hard to tell how deep the knowledge goes if you don't have any to contrast it with, right. It's like Keathley saying people can build a login page and have no idea, and not understand crypto at all, right. They may talk about an abstraction, but actually not understand what's happening under the hood, right.
Sundi: I can't-
Chris: I mean-
Sundi: I can't believe, I'm, like, 15-year-old me would not believe I'm saying this, but I actually do want to go back and, like, try to understand the math I didn't understand, yeah.
Anna: Same. I am actually.
Justus: Okay, unanimous agreement, we all wish we knew math better.
Amos: There's a really good-
Sundi: -But also like, on that, I wish I hadn't sold my stupid expensive calculus books from college.
Sundi: I just didn't want to move with them anymore.
Anna: I think the underlying point, though, that I think is interesting, is about how you really develop expertise. And I think it speaks less to your natural ability to learn something, and more to the curiosity to dive a level deeper and go understand something. Not just use the tools at hand, but, like, understand how they work. That's actually how you develop expertise and actually become really good at what you're doing, right. Not just living in the day-to-day abstractions and that being okay. And, like, that's the difference. And I think it speaks less to, like, a person's aptitude as much as, like, curiosity and willingness to learn.
Sundi: Yeah. Similarly, I think I told Eric and Justus this, but when I was helping my parents clean out their basement, I found, like, my first Computer Science 101, like, midterm, where I got 51 out of a hundred. And the professor said, if you got a fifty out of a hundred or below, come see me. And I was like, yes, I don't have to go see her. Um, but I was looking at all the questions, and I was like, A) I know all the answers now, and B) how did they expect anybody to write these brackets with pencil?
Chris: Right?! People are out of their minds when it comes to giving computer science tests. Do you know why they do that, by the way?
Chris: I found this out, that's an ABET thing.
Sundi: Oh. It's an accreditation thing.
Chris: It's an accreditation thing. You're required to submit tests in specific ways, or you can't pass ABET accreditation, which is why computer scientists have to write frigging C++ with pencil.
Justus: So then they-
Amos: -Yeah, they take off a letter grade because you forgot to put a semi-colon at the end or-
Sundi: That, that was the part that blows my mind. Like, okay, fair, I should have gotten a 51 out of 100 on that test, for sure. But not because of the bracket placement. But, um, I mean, that really shows-
Eric: -I don't know that I ever had to write out code on a test.
Chris: There are other ways to get-
Eric: But I don't know that I went to accredited schools, I guess.
Chris: So there's, there's other ways to get past the, there's other ways to conform to the accreditation process. But that's like the easiest, like lowest barrier way to do it.
Eric: I remember a lot of, uh, zipping up NetBeans projects.
Chris: Oh, sweet Jesus.
Chris: Yeah. Wow. That's a, that's, that's a, that's a type of pain.
Sundi: I actually, I looked really strongly into the accreditation process a while ago, just because of, like, uh, my involvement with IEEE. And I was just trying to understand it. I still don't get it. It's so confusing.
Chris: No, it's the whole thing is madness.
Justus: I think the last time we were on, we talked about how I gamed the financial aid system in college.
Chris: That was on your show, I'm pretty sure.
Sundi: Whoa, yeah.
Chris: And then you cut it all out.
Justus: Did they really?
Chris: You cut it out! I listened to that at the time.
Justus: Ooh. That's probably good. I don't think it was, I don't think it was illegal.
Eric: Rose cuts stuff.
Justus: I have no say. Rose cuts things. And (inaudible)
Eric: Yeah. They cut things out.
Anna: Your show is definitely more on topic than our show.
Chris: You cut out my, my amazing impression of, uh, the Hufflepuffs sitting there watching Harry Potter win, yet again.
Sundi: No, no, no, no, that was at the end.
Anna: Can you do it again please?
Justus: Can you do it real quick?
Chris: You can’t do it just on demand, okay. It's gotta be in the moment.
Amos: Keathley, you can do it on demand.
Chris: Maybe, maybe, maybe you'll get that later.
Justus: Hold on. Hold on. Let's, let's, let's tee you up for it. Hopefully everyone can hear this.
(Drumroll sound effect)
Chris: Oh! Nailed it!
Sundi: Oh man!
Amos: That is the best program. Loopback and Audio Hijack.
Chris: It's really good.
Justus: You guys are so lucky I have not just been littering this audio with nonsense.
Chris: Listen. It’s your show. As we say on this show-
Anna: -It’s your show.
Amos: You got to do what you want.
Chris: We're just here.
Amos: Is it the show?
Anna: It is.
Chris: I think this has been the show. If not we need to start soon.
Anna: It’s been the show for a while now.
Amos: We gotta go.
Chris: I mean, it doesn't feel like the show, right?
Justus: Okay, what was that whole conversation- you guys were having a whole conversation.
Eric: What now?
Sundi: Justus wasn't paying attention to our conversation. So he's now asking the TLDR on it.
Justus: In the Slack channel. So we got a reflect channel.
Anna: You're the only person that I've heard who actually says it out loud. The "TLDR" is too long to say.
Sundi: I say it all the time. All the time.
Justus: I do that, or you do that all the time?
Sundi: We both do it. I think
Justus: Sundi and I kind of have like, uh, like we're sharing a spirit animal.
Sundi: Yeah. On the show only.
Amos: In real life. I'm nothing like him.
Sundi: (Laughter). Yeah. thanks for pointing it out.
Justus: It's alright. I won't take offense to that. I mean, yeah. That's probably right, actually.
Chris: Here's the real question: do you think there's going to be an in-person ElixirConf this year, and what talk should I submit?
Amos: The only reason I went and got a shot this morning is the hope that there would be an in-person conference. Otherwise I'd just stay home.
Chris: I'm hitting my two weeks late next week.
Chris: Then I'm fully vaxxed.
Anna: I'm fully vaxxed too.
Chris: Vaxxed, waxed, and ready to relax.
Anna: I've been hearing that a lot lately.
Amos: Nobody takes a, a selfie when they're getting a flu vaccine, but there, they had a big sign that said "I got my vax," like, it's a word bubble, and everybody was taking their picture in front of it. And I was like, I'm just going to go sit in my chair. I'm good.
Sundi: Yeah. Oh I hate selfie things. The selfie thing is just driving me nuts.
Justus: I'm just going to brag about, on my podcast. I don't need a selfie.
Chris: I have an audio selfie.
Justus: An intellectual selfie. Yeah.
Eric: Only normies use visual media.
Chris: I am a gentleman of class and intellect. With a podcast.
Amos: I say things like linear algebra.
Chris: Oh my gosh. My wife told somebody the other day that I have a podcast and I almost died.
Justus: This actually ties back into the conversation we were just having, which was, like, I think we were coming really close to touching on this thing, which is: because we're all engineers and we hang out with engineers, and, like, smart people become engineers, right, we only experience hanging out with smart people. And engineers generally will just learn things very easily and quickly compared to, like, normal people. And I think that maybe, being surrounded by this, like, category of human being, we get, like, a false impression of what people are, and think that, like, oh, anybody can just go learn, like, linear algebra. Because everyone I hang out with and know can just pick up and learn linear algebra. But, like, my best friend from childhood, like, definitely is not picking up linear algebra, like, sorry, um-
Amos: It's, it's the things that you're trying to pick up. The people that you're around are picking up the same things that you're picking up, which is why it seems easy. Uh, I mean, go out and try to learn to do metallurgy, and you'll run into that. There are people learning metallurgy that pick it up easily. You may not. Or plumbing.
Anna: Yeah, exactly.
Justus: This is like when PJ was picking up woodworking. So I went out and bought a bunch of woodworking stuff and just made the ugliest shop table you've ever seen, with, like, wood glue and, like, terrible, like, pinions sticking out. And, like, the legs were at, like, slight angles. It was just terrible.
Amos: You made your own work bench so that you could not work on it ever again.
Chris: There's a lot of machinists that I've worked with in the past who, who learned things easier than other engineers that I've worked with. Like, I don't know. I think there is a type of person who is self-motivated. Um, and it so happens that there's a lot of prestige that gets attached to the types of things that we learn, as of today, right. As we sit here today, uh, tax day, well, not this year, but generally tax day, uh, in the year 2021, there's a lot of prestige attributed to, like, "You know calculus. That's amazing. I could never learn calculus!" And it's like, yeah, but I still don't know how to, like, I don't know, cut, like, armatures on a lathe. Like, how do you do that? That's amazing.
Sundi: I don't even know what two of those words are.
Eric: To be fair, everyone in the world takes a calculus class before they graduate high school or-
Amos: -Whoa, watch yourself.
Anna: Not everyone.
Justus: I definitely don't have to.
Eric: Okay. Okay. Okay. Most Americans at some point in their, like, public education are introduced to some form of calculus, right?
Chris: But the idea of being like, you know-
Justus: Not in Maryland.
Justus: Yeah. At my high school pre-calc was like, that was the minimum that you'd get to if you-
Amos: -I think geometry. It was, like, Algebra 1, 2, and then geometry, yeah.
Sundi: Geometry was the highest. And then statistics was what every senior took. If you weren't taking, like, AB, or sorry, if you weren't taking AP classes, you could tap out after geometry and get to statistics, and that was it.
Justus: Okay. This is coming from a really like, I'm terrible at math, but we were taking geometry in ninth grade.
Chris: Didn't take it.
Sundi: That's if you were ahead of the curve. I took geometry in ninth grade because all my friends were doing it, not because I was like, I want to be ahead of the curve in math.
Eric: But yeah, but like a lot of seniors are taking calc 2.
Chris: Didn't take calculus, barely passed trig, almost failed out of college.
Sundi: Yeah. I took that as a senior. And failed.
Eric: I, I almost failed trig. Uh, trig was hard. Well, so, geometry freshman year, and then we took trig sophomore year, and I remember I hated this trig teacher, and then I got her two years in a row.
Amos: Hating a teacher is probably what hurts most people in things that they would, might enjoy and love.
Anna: That's true.
Justus: Yeah. Yeah.
Amos: And it’s okay, Chris, I got a 1.2 my first semester, 22 credit hours. So we're in the same boat.
Chris: Man. I look back on that transcript and I'm just like, ooh, oof. Big, big oof.
Justus: Wait, is this college for you?
Sundi: I don't know if I could find my transcript. I hope I don't need it.
Justus: I was a terrible high schooler. And then college, I finally turned it around, so.
Sundi: Oh, I was better in high school. And then in college I was like, meh, my brother was like, not the greatest in high school. And then was like, whoosh, Dean's list every single semester in college. And I was like, okay, opposite siblings.
Eric: Yeah. I think some, like, some category of person, and this is my category of person, just rebels against being told what to do. So if you don't get to decide, you're just going to do everything you can to sabotage it for yourself and everybody else. Versus when you go to college and it's totally on you, you're going to excel, because it's what you want to do. Um, that's how it was for me.
Sundi: For me, I was just so tired. I had worked so hard in high school, every extracurricular, just all over the place. By the time I got to college, I was like, can I, can I rest? No, no, this is not the time for resting. Darn it.
Amos: It was just too social. I just wanted to be around everybody. Somebody walked by my room while I was doing homework and I'd be like, what are you doing? I wanna leave. I'm out.
Chris: Yeah. I think, I think there's also a lot to be said about, like, programming as a thing is a very largely, very-
Anna: Is that a thing?
Chris: No, yeah. Well, I thought about it a lot, and I don't think programming is a thing. No, I think part of what makes programming difficult is that the majority of programming, uh, and being good at programming, is this ability to hold abstract rules in your head and make inferences about the abstract rules, even if you don't know them. So, like, it's basically, can you just invent a world in your head where these rules make sense? And if you're good at that, then you have a natural proclivity to be good at programming. And if you are not good at that, uh, you're going to have to learn that, because that's a huge part of what makes you a good programmer. And I don't know how to teach that either. Like, I saw it when I was an adjunct teaching CS 101, being literally unable to teach, like, how variable assignment worked to people. Like, they got to the end of the semester and still didn't get that.
Justus: I was thinking of, like, so, the first thing that occurred to me was, I was thinking of composers, like Bach, Mozart, or whatever, and how-
Chris: -I'm familiar.
Justus: -It's very similar. You have to be familiar with abstract concepts and their relationships to one another, and then be able to use that to create something. But the other thing that kind of occurred to me was that we have such a diversity of types in this industry. Where, for example, I was really bad at math in high school. And then, uh, in college I did not take a single math class. Like, I was like, I'm not taking any math if I can avoid it. But I was always very good with the verbal stuff. And a lot of the people I meet in tech are kind of the opposite. And it's just interesting that you can come to programming from either angle: either a verbal kind of creation of worlds and sort of logical articulation of things, or a mathematical abstract conception of things.
Sundi: Okay, Justus, I guess maybe we are aligned in the, uh, in the spirit animal in the real world. Cause I was also not a math person but a verbal person.
Justus: I feel like my spirit animal is, like, your spirit animal's probably like adorable and fluffy. And my spirit animal is probably like a mangy wolf hound, like howling at the moon.
Chris: But I think the important thing is, like, I don't think that those things are mutually exclusive. And I also think math is taught in, like, the worst ways possible.
Chris: Math is taught, basically, it's like-
Justus: -Oh, it’s about numbers.-
Chris: -Hey, what if we taught people music, but for five to ten years we only let people, um, read music and not play it themselves? And then when we did let them play it, we only gave them a kazoo. That's how math is taught.
Anna: Math and the sciences. Like, my parents went to school in Russia, right. And there they start integrating all of the sciences from the very beginning, right? Because at the same time you're learning something in math, you're learning something that relates in physics, you're learning something that relates in chemistry, right. And it becomes a lot more real world. We teach a lot of these things in a vacuum, and so it makes it a lot harder for folks to grasp understanding, because they're only seeing it in a silo, and they're not seeing the application of it in the real world until much later.
Justus: Science in particular is infuriating, because the way they teach science, it's like, here's a bunch of facts that were figured out by gods. And real science is iteratively experimenting on the world to reduce uncertainty about material reality. And they don't teach that at all. Like, a little bit. I was really lucky, I had a couple teachers. My fifth grade teacher brought a set of cow lungs into class.
Chris: Just had those lying around.
Justus: He like got 'em from a butcher. And he took a vacuum cleaner.
Anna: As you do, you know.
Chris and Anna: As you do.
Chris: What do you need these for? And it's like, no, no, no, I'm going to take these and show them to a bunch of kids.
Justus: Fifth graders. Yeah, yeah, yeah. He stuck a vacuum cleaner in and blew up the lungs to demonstrate, like, the capacity of a pair of lungs, right, and how much they expand as they breathe. One time he sent me up to the front of the class.
Sundi: Why couldn't he have used a balloon?
Justus: Because that's boring.
Chris: Listen, there's a reason Justus remembers this.
Justus: He was the best teacher I ever had, my fifth grade teacher. And he focused a lot on character. Like, he always said things to us like, "Character is what you do when no one's looking." Um, and that stuck with me, right. He took me up to the front of the class one time, held me upside down by my ankles, and had me eat a Ritz cracker to demonstrate that your esophagus is a muscle and it's not gravity as you swallow things. And I was like, "Oh my gosh, this man is a genius. He's a genius." And so now my entire pedagogy, actually, normally on Thursday mornings my nieces and nephews come over and we do, like, homeschool at my house, and, uh, now that I think about it-
Eric: So you hold them upside down-
Chris: -So you hold them upside down and give them Ritz crackers.
Justus: My entire pedagogy is like, "How do I do stuff like that?"
Anna: Oh, man. Alright y'all.
Chris: So I went and grabbed a cow lung.
Amos: I don't ever want to see your character, by the way.
Anna: Yeah. This has been super fun. You all should continue. I have to jump.
Chris: Bye Anna.
Anna: But thanks for being on the show.
Justus: Anna, thank you for having us on the show.
Sundi: Thank you for being on our show.
Amos: Thanks Anna, have a good day.
Anna: See y'all.
Chris: Don't forget to upload your audio.
Amos: I'll tell her, I’ll tell her.
Justus: As a total tangent, you brought up kazoos, uh, Chris. Um, my one-year-old just figured out how to do kazoos, and it is, like, the greatest thing for him.
Chris: Oh yeah, absolutely.
Justus: Especially when he's, like, kazooing and then he notices that you're watching, and he starts to smile, and then he can't kazoo anymore.
Chris: But I'll just throw in there too, I think that the notion that, um, you know, you can't learn some of this stuff, and you can't learn, like, how to hold an abstract set of rules in your head, and you can't learn, like, some of these innate things that some people just get, is totally false. Like, as a person who sucked at math and was not good at math, I have gone back and done a lot of remedial math. And, like, I come out of it with a better understanding. I have used math in my goddamn job and it worked.
Justus: I know, I know, but here's the danger, Chris. I think the danger of this, like, oh man, I can't think of a good corollary. Okay. So the danger of this is that in assuming that everybody can, we make decisions based on that assumption. And, I mean, like, I hang out with a very mixed group of people, like, in real life. And I just don't actually see that most adults can. I think it's like, some subset of adults do and some subset of adults don't, and then, like, the vast majority is probably in the middle, you know. But I feel like it's extremely dangerous. It's kind of like, um, how do I put it? Like, okay, we have a lot of moral axioms in our culture, right? Like moral axioms, like, murder is always wrong, right. Now that only makes sense in the context of our culture, right. It's because we taught this over so many years.
Justus: Like, if we get rid of that stuff, there's going to be some subset of the culture that just doesn't have, like, an internal locus of control that says, "Oh yeah, murder is wrong," you know what I mean? Yeah. And the same thing is true, I think, for intellectual pursuits, which is, like, some subset of the culture just does not have, like, the mechanical, whatever the facilities are, to do this, right. And so, like, when we make these assumptions about people's sort of base level of intellectual capability, or base level of moral intuition, uh, I think we get into this situation where, like, oh yeah, with our groups, you know, yeah, anybody can learn anything. You know, if I wanted to go learn rocket science, like, whatever, give me, like, six months and a textbook and we'll do it.
Chris: Oh, you're basically Elon Musk. (impersonating) "Oh, I got lucky one time making it easier for people to make, uh, payments on the internet. So I should definitely know how to get kids out of a collapsed cave."
Justus: To be fair, he got lucky with the maps thing first. And then he got lucky with the stupid, uh, payments.
Chris: I was successful at one time in all of my ventures. And that probably means that I'm an expert at this new one.
Amos: We're going to Mars. We're going to Mars.
Chris: Going to Mars.
Justus: I mean, the man lands rockets on boats in the ocean.
Amos: No, the people that worked for him land rockets on boats in the ocean.
Justus: Oh, man, come on.
Amos: You think he's sitting there doing all that, man?
Chris: Wait, I can't tell what side you two are arguing for.
Justus: No, I think the unnecessary... like, I saw this yesterday. This guy was like, Bezos, Elon, these people don't care about the planet, and all they care about is this space crap. And the guy was an economist, right. And I'm like, bro, you're talking about rocket science being a waste of time or money or expenses, whatever, and you're an econ... like, I studied economics in college, so I know it's a waste of time. You know what I mean? And you're going to be judging the literal, like, literal rocket... like, I don't know. So, like, Elon and Jeff Bezos both get, like... and I don't even like Jeff Bezos. I think Amazon is super wicked and, like, acquisitive, bad for capitalism, whatever. Um, you know, but, uh, like, I don't know, this whole argument that, like, space travel is not worth it or whatever. Like, we wouldn't have hearing aids or Tempur-Pedic mattresses. And, like, oh, "he's not smart"? He's, like, obviously a genius, like-
Chris: -A fun story, this is a good tangent. I was with my, uh, in-laws, and, uh, we were sitting there and they were watching a football game. And it was a very extended family, live out in the middle of Kansas, uh, very rural Kansas. Like, the true salt of the earth, right? These are all, like, nurses and firefighters and police officers and teachers. Like, the, the true, like, beating heart of this world, right. And I made a joke to my brother-in-law, who's woke and lives in New York. And, um, if you don't know he lives in New York, he'll tell you within, at least, like, 30 seconds. It's like, he's, he's like-
Eric: -Oh they always do.-
Chris: He's like a Vim user, you know what I mean? Like even if you don't know, you'll find out.
Eric: People who once lived in New York will tell you about it within 30 seconds. "Oh, I lived in New York."
Chris: Yeah, well, you know, listen, I know what's going on. I lived in New York.
Eric: Which means you don't.
Chris: Um, so in any case, uh, I looked over and I was like, you know, with this entire football game, we could basically fund NASA for at least, like, the next year, if we just, like, took all their salaries. And I was making a bad joke cause I was feeling awkward. And, uh, one of my wife's cousins looked at me and he's like, "Yeah, but what's NASA ever really even done?"
Eric: Yeah. Yep.
Chris: And I was like, yup.
Eric: Got to the moon.
Chris: No, you're right. I have nothing.
Eric: Have you guys ever looked at the Mariner missions?
Amos: Employed 250,000 people for 10 years.
Chris: Nothing. Absolutely. Absolutely nothing. Then later on, they were looking at my other brother-in-law's new phone. He's like, "That's, that's the new iPhone, huh?" He was like, "Yup." "That's an iPhone 10, iPhone X." I was like, "Yup." And he's like, "Well, what do you think the next one's going to be?" And I was like, "I don't know." "Maybe the Xa, or it could be the Xb, and then it would be the Xc." I was like, "I think it's just gonna be the 11." It was good. It was good. It was a good family reunion.
(Ba-dum- psss sound effects)
Eric: Oh, I love your accent though.
Chris: The heart blood of America right here, like.
Amos: Salt of the earth.
Eric: Salt of the earth, yeah.
Chris: And it really is, it really is the truth.
Eric: And we need all types, all types.
Chris: I'm just recommending that if you, if you're like, if you didn't get math, it may not be because you can't get math. It may be because, you know, you were given a kazoo. So, like, try it out, try to get this.
Justus: So this I will also agree with. Um, I think you're right that the education is terrible and that most people can get math. I'm just saying, the same way we're really good at accommodating the good outliers in our society, we need to be just as good at accommodating the, the not-as-good outliers, right? But I agree that probably the vast majority of people can figure this stuff out and it's just an educational problem. I'm happy to blame it on the education system, for sure.
Sundi: I'm not surprised.
Eric: Some amount of it is like people putting in their own blockers of like, “Oh, I just can't do math, so I'm not even going to try.”
Chris: Yes. Right.
Sundi: We learn that.
Chris: Cause that's the thing I want to, I want to tear down. It's like, don't put up your own walls and go, like, "I can't do this." Because, like, you know, let's be clear, there's a huge heaping five tablespoons of privilege that goes into saying stuff like, "Just go out and do it." And it's like, "Oh yeah, cause you just have time and money to be able to go do that. Of course you can go do that," right?
Sundi: And I feel like this is actually how we started talking about math more often, or at least this was the turning point for me. When I started talking about math again was, like, on the Randall Thomas episode of the, uh, Elixir Wizards podcast. We were talking with Randall, and, uh, when I took the, uh, Groxio class from Bruce, the one about LiveView, it was with Randall Thomas. And I said to Randall at some point, "Oh, I'm just bad at math." He was like, "No, you're wrong. Nobody's bad at math." And he, like, went on this tangent. And I was like, "Oh my gosh, what did I say? What did I do?" And then we brought him on the-
Chris: -See, that's what I'm arguing. That nobody's bad at math.-
Sundi: -And then we brought him onto the show and Eric and Justus made me ask him what he meant.
Amos: I totally agree with Randall.
Sundi: Um, yeah. And now, now I've come around to that way of thinking. This has been like, “Oh, it's been a year. Oh my gosh. It’s been an actual year.”
Amos: There's a book, I linked it in, in our Slack channel, A Mind for Numbers by Barbara Oakley where-
Justus: -Oh, I love that one.-
Amos: And she was failing at math and ended up, she's a math professor now, and it's pretty amazing. And she talks about how to, how to learn all that. And I think it's just that we often have teachers that have one approach, maybe two, and there's probably 50 different approaches. And maybe you just haven't found yours, or you were preoccupied with something else at the time that you tried to learn it. So.
Eric: Oh man, for some reason that just gave me a flashback to calculus in high school, where our teacher was trying to teach us, uh, integration, where it's, uh, the slide of the slope.
Eric: Or the, the... yeah, the slope of the log, whatever. I've, I've forgotten.
Eric: But anyways, the way he taught us was, he was like, "I'm cleaning my gutters, and then the ladder slips and I'm starting to fall, and I point to my daughter and go, 'Quick, go get the calculator to see my rate of descent!'"
Amos: It's one of those high-quality math jokes right there.
Chris: Top drawer math jokes.
Justus: That was really good.
Amos: Well, we've been on here an hour.
Sundi: I'm not surprised that this, yeah. I was about to say, I'm not surprised that when, uh, all of us get together that we somehow produced the math-a-sode, but, uh, here we are.
Justus: So tell us how this went. Was this like, you know, what you were expecting? Better than you hoped for?
Chris: This is exactly what I was expecting. We just talk about stuff. And sometimes we talk about Elixir.
Sundi: There was Elixir in the middle.
Chris: I had a bunch of other topics, I had a bunch of Elixir topics queued up that we just never got over to. Like umbrellas. Umbrellas are garbage, still are garbage, still not a fan. My opinion on umbrellas has not changed since day one. And, uh, the most recent umbrella I'm working on is just as bad as the first one I worked on. So, I appreciate you, Wojtek, I know you've put a ton of time into this, and it's not any shade against you. I just don't like it when people do quote-unquote domain design in umbrella apps. It is a bad idea.
Sundi: I don't think that we-
Justus: -Can we say something nice about LiveView?
Chris: It's cool. It exists.
Justus: It exists.
Chris: That's my, that's my take so far. Um, and also contexts. I just think contexts are probably not what you want.
Sundi: Um- that's a whole lotta- you know- are we at the end of the show?
Eric: But I got to know at least like, what's that thing Justus always says, the TLDR.
Amos: You want to know the context of the contexts?
Eric: I just want the TLDR, the too long, didn't read whatever. Yeah. Okay.
Chris: Fair enough. I'm gonna, I'm gonna, I'm gonna take it and turn it. Uh, what's the goal of design?
Justus: What's the goal of design? To solve.
Chris: Software design, obviously, right? Like in this context, like what's the goal of software in general. What's the intent?
Justus: To come up with a solution that doesn't suck.
Chris: Okay, sure, sure.
Amos: Okay, that's not bad, I would say to come up with a solution that reduces complexity and is able to change with the changing requirements of the world around you.
Chris: Okay. That's basically how I would describe it as well. That's how I would describe the goal, but there are a lot of other weasel words in there, and you know how I am, Amos, you know how I am about these weasel words that programmers start throwing around, like refactor, when they just mean break stuff. And, like, you know, programmers in general-
Justus: Is that what refactor means?
Chris: Programmers in general use.
Eric: Is this where we're supposed to bring up business logic?
Justus: We need a Chris Keathley lexicon. We just need like, like some kind of mapping between the words everyone else uses. And what is in-
Amos: Don't worry, Keathley, Justus is gonna let you finish.
Chris: Like, if I'm, if I'm going to, like, change a piece of code, I don't say refactor. I say what I'm actually going to do. Like, "I need to improve the performance of this, so I'm going to take this naive linked list and turn it into a, uh, map with index keys."
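As a rough sketch of the kind of change Chris means here (the variable names are made up for illustration, not from the show): Elixir lists are linked lists, so positional lookup with `Enum.at/2` walks the list from the head every time; rebuilding the data as a map keyed by index makes each lookup constant-time.

```elixir
# Hypothetical example of the refactor Chris describes.
items = ["a", "b", "c", "d"]

# Naive: walks the linked list from the head on every lookup, O(n).
Enum.at(items, 2)              # => "c"

# Indexed: build a map with index keys once, then look up in constant time.
by_index =
  items
  |> Enum.with_index()
  |> Map.new(fn {item, i} -> {i, item} end)

Map.fetch!(by_index, 2)        # => "c"
```

The point of his naming convention is that "turn this list into a map with index keys" states the intent and the trade-off, where "refactor" states neither.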
Justus: So if I say I need to pull a bunch of business logic out of the controller and put it into the model, is that adequate?
Chris: Uh, I mean, I would, listen, I'm not gonna correct you, right. I don't correct anybody else for using the term refactor. Like, I hear them all use it and I don't go like, "Well, actually it's not it wasn't what are you actually trying to do?" You know, like I don't get all uppity with people. I just don't personally use these words. So I would not correct you. Like I would, I know what you're talking about and I know what your intent is, and so I'm not going to like tell you you're doing it wrong, but I don't personally use those words.
Justus: But wait, so then contexts.
Sundi: Justus, I'm telling you, there's a whole episode to be had on just this conversation. There's no way-
Justus: -Yeah. But just give me, like, what does that even mean?
Amos: Encapsulating logic into an actual, probably non-reusable piece that, uh, you probably care about at a different level than where you're putting it anyway. And most, uh, people using contexts that I run into, you have a context that links directly to a database table. So it's not really a context.
Chris: Or you have a one-to-one mapping of, like, controller actions to calls into your context, right? Like, that gets pretty close. They present very non-reusable, uh, non-reusable modules. And the goal of design is to, first of all, understand that requirements are never fixed. Your software is never a fixed point. There's nothing static about it. And that goes for everything, by the way: the code itself, the team you're on, the requirements, the feature set, the infrastructure it's running in, all that stuff. Like, nothing about the software that, in general, we work on is fixed. If you're building a compiler or a theorem prover, probably fixed, but we don't work on those sorts of things in general, right? And I would go so far, I would hazard a guess, that the majority of people listening to this, all 10 of you, are not working in a system wherein you control everything forever. You control something for a single point in time, and you need to maintain software such that it can live for an extended period of time in general.
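For illustration, a hedged sketch of the shape Chris is criticizing (the module names are hypothetical and this assumes a standard Phoenix/Ecto app, not code from the show): each controller action calls exactly one context function, and each context function is a thin wrapper around the Repo, so the "context" adds a layer without adding a reusable boundary.

```elixir
# Hypothetical sketch: a one-to-one controller-to-context mapping.
defmodule MyAppWeb.PostController do
  use MyAppWeb, :controller

  # Each action delegates to exactly one context function.
  def index(conn, _params),
    do: render(conn, "index.html", posts: MyApp.Blog.list_posts())

  def show(conn, %{"id" => id}),
    do: render(conn, "show.html", post: MyApp.Blog.get_post!(id))
end

defmodule MyApp.Blog do
  # One function per controller action, each a thin pass-through to the Repo.
  def list_posts, do: MyApp.Repo.all(MyApp.Blog.Post)
  def get_post!(id), do: MyApp.Repo.get!(MyApp.Blog.Post, id)
end
```

In this shape, renaming `MyApp.Blog` to anything else changes nothing about the design: the module boundary mirrors the web layer rather than a domain concept, which is what both Amos and Chris are calling out.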
Justus: Can I, can I ask you a question? And this is a very specific question, but like what do you have an open source project that you like to use as an example of like good Elixir design?
Sundi: Oh, wait, don't tell us, Keathley, tell us on a talk you were going to-
Justus: -No, tell me now! This is a genuine question I want to know now.
Chris: I, uh,
Sundi: Keathley is gonna do a talk on this
Justus: I want to know now. And this is an open question, by the way. I also want Amos, and then Amos can you also answer the question so that I can actually have questions answered, like in real time, that's the whole point of a podcast, right?
Amos: We're at an hour and eight minutes and uh.
Chris: Everyone's stopped listening at this point. So let's just talk.
Amos: I've had two glasses of water and I'm ready to roll.
Eric: I think the only thing missing to make this an Outlaws episode is what did your, uh, daughters have for breakfast, Chris?
Chris: Oh, uh oatmeal.
Justus: Oh, and uh, what does, uh, Amos's shirt say?
Amos: Test all the things.
Sundi: I do real quick though have to tell you, Amos, do you know what the theme of our next season is on the Elixir Wizards podcast?
Amos: Of your next season? I have no ide-
Chris: Contexts are awesome.
Sundi: There's a reason I'm asking Amos, specifically testing.
Amos: Is this, uh, have to do with business owners?
Justus: (snorts) Catholicism!
Sundi: It is, we are going to be talking about BEAM magic. So a lot about the BEAM, but a lot about magic, inspired by a tweet by the one and only Amos. We want to talk about magic.
Justus: Really? It was inspired by an Amos tweet? Was it a Famous Amos tweet?
Justus: All right. Well, Chris, you can DM me with your favorite GitHub repo.
Chris: Alright, well, let's stop the show, then we'll go talk about it. We'll keep the people listening and then, you know, we'll, we'll figure it out.
Justus: Rock and roll. Great show, everybody.
Chris: This was good.
Amos: We'll see you all later.