The Nostalgic Nerds Podcast

S1E13 - Please Enter Your Password… Again

Renee Murphy, Marc Massar Season 1 Episode 13

Send us a text

Renee and Marc break down the evolution of authentication in recent times. RSA tokens. The rise of mobile authenticators. The not-so-long, painful reign of SMS codes.
They move through SIM swapping, behavioral biometrics, and the shift toward systems that study how we type, move, and interact. They dig into trust, privacy, and the growing gap between convenience and control.
The conversation pushes toward the future: digital agents, intent-driven identity, and a world where authentication fades into the background.
A fun nostalgic look at how logging in became a moving target and why the next wave feels different.

Join Renee and Marc as they discuss tech topics with a view on their nostalgic pasts in tech that help them understand today's challenges and tomorrow's potential.

email us at nostalgicnerdspodcast@gmail.com

Hey, everyone. Welcome back to the Nostalgic Nerds podcast, the show where we explore the evolution of technology and how it's shaping our lives. I'm your host, Renee, with my co-host, Marc. Yo. And today, we're diving into a topic that's all about security and authentication. We're going to trace the journey from the humble RSA SecurID tokens, those little 60-second OTP fobs, to today's continuous behavioral biometrics like keystroke dynamics and gait analysis. It's a fascinating progression that's all about how we prove who we are in the digital world. Think about it. We've gone from having to type in a code from a little fob every time we log into something to being able to authenticate with things like our typing patterns or even the way we walk. It's crazy how security has gotten both smarter and more invisible over the years. So let's take a trip down memory lane and look at how authentication methods have evolved, starting with the classic RSA SecurID token and moving all the way to the world of behavioral biometrics. All right, Marc, let's start with the 1990s. If you were in the corporate world dealing with sensitive systems, chances are you had one of those little RSA SecurID tokens hanging from your keychain. Okay, first of all, Renee should never have anything hanging from a keychain. She will lose it. I remember that. I'm still like that. And anything separate from that would be like, I just don't, oh, it's so bad for me. So if you had one, they were small, simple devices that generated a one-time password, an OTP, every 60 seconds. You'd type in your username, and then the fob would give you a unique code, which you'd enter to prove you were who you said you were. They were a big deal at the time because it was an extra layer of security. You weren't just using a password. You know, the one you wrote on the bottom of your keyboard. You're not using that anymore.
You needed a physical token to authenticate yourself, which made it harder for hackers to just guess your logon details. You had something you owned, like that fob, plus something you knew, like your password. It was the beginning of two-factor authentication. But while it was better than just a password, the RSA SecurID tokens still had their limitations. You had to carry them around and lose them. Typing that code in every time you wanted to log in could get a little annoying, right? And especially if IT was nudgy enough to lock your screen after five minutes of you not doing anything with it. And then you're, like, oh, got to log in again. So it's a little annoying, right? It's a little annoying. I can't remember. Did we have the tokens at the dot-com? No. No, that was... yeah. Because we didn't think the world was a bad place in the dot-com days, right? Yeah, but also, you couldn't really log in from the outside either. There were a couple of ways in, but it wasn't terribly easy. So, okay, though, I want to tell you a little bit about the origins of these things, because it's probably counterintuitive. So, guess what? The little tokens for digital security? They were built for physical security, not digital security. And I learned this from an RSA sales rep. Not one of the ones from EMC, but an actual RSA sales rep. The ones that are, like, real. Not someone who sold storage. Someone who sold security. Exactly. Sold security. Well, you know, they were different. Okay. All right. We'll get back to the physical security. But I'll tell you about my EMC and my RSA sales reps. I was a customer of both, right? As we all were back in the olden days. And when EMC bought RSA, the EMC sales engine basically consumed the RSA machine. And it was a completely different culture.
You know, my EMC rep, his name was Scott. He was well-coiffed. He drove a nice car. Yeah. You know, it was always the blazer. Smelled good. Knew the best restaurants. That's right. That's an EMC rep. Frosted tips, you know. And my RSA rep was, yeah, no. He had khakis and a blue shirt. And, you know, bad loafers. I would say that's the uniform of a security salesperson. Yeah, yeah. Well, you know, I sometimes resemble that remark. But anyways, he was telling me the story about physical security, and how they created the tokens, way back in the original iterations, before they were called SecurID and got productized and all that. They built them so that they could put them in different places inside of a building. And a security guard would come walking by, look at the token, and write down the number. And that was how they knew that the person in the building was actually going around and actually seeing those different things in the building. Yeah, so it was kind of cool. And I actually saw one of the original ones, which had a little enclosure and showed the numbers and stuff. It's a clever application, you know, to prove that somebody was actually someplace at a certain time, because it's a time-connected algorithm. Right. So I just thought that was really pretty clever. But, you know, annoying. Yes. Every 60 seconds, you'd put that thing in and you'd fat-finger it. And then... You have to wait for a new one. Yeah, the little thing would be disappearing because it's a little LCD. It looks like a calculator, you know, and it'd be like, oh, can I get it done fast enough before the little timer runs out? Or no, I've got to wait. And then you fat-finger it. No, I've got to wait again. Yeah, I totally hated those things.
I think my first SecurID was probably in the late 90s, because I was trying to think if we had them at the dot-com. I didn't remember if we did or not, but it was probably at the gas company. And that's how we got into everything, right? I mean, you know, I was at Northrop and one of the gas companies, and we used them that way. They're annoying, yeah. But at the time it was like, oh, that's cool. That's cutting edge, right? Being able to log into stuff remotely, securely. That's pretty cool. I had some at First Data. I was managing some of the internet platforms. And this is one of the downfalls of the technology, right? Before we had all of these single sign-on platforms, I had bunches of the little fobs, because I got different internet platforms from different companies that we bought and sold and whatever over the years. Of course, they all had their unique tokens. You'd have to keep them around. If I dropped them, I'd lose access to the financial interwebs. So that was a lot of fun. But gosh, I think that might have even been my last physical token, at First Data. It was so long ago. Do you remember, as we move into the 2000s and 2010s? It was long gone by then. No, no, no. Yeah, it was long gone. Except, and we'll probably talk about this, the banks were deploying, not necessarily SecurID, though some of them did deploy SecurID. Some of the banks were deploying, like, early internet access, right, online banking. They were deploying tokens of some sort. HSBC did, Barclays, Lloyds, you know, even some of the Chase stuff for private banking. So, yeah, crazy. Why would you deploy those little plastic tokens to millions of customers? Like, I can see it for an IT administrator, but... Like, how desperate are you to say there'll be no fraud, right? And especially if I was your customer, I'd be calling you every other week.
You'd be like, Renee, it's $75 if we send you another one. Like, it would be like that all the time. It'd be worse than losing your keys for your fancy car, yeah. It would be. But, okay, so as time went on, security had to get more convenient. That's the point we're making. The world was moving faster, and typing in that 60-second code every time was starting to feel a bit clunky. Enter the mobile app-based authenticator codes, like Google Authenticator or Authy. Really, Authy? That seems like... So go ahead, Clippy. Like you weren't thinking hard enough. These apps still generated time-based OTPs, but now they lived on your phone. No more carrying around those little hardware tokens. And believe it or not, you guys, I don't lose my phone that often. Plus, my iPad has what's on my phone, so it all works out in the end. The mobile app was a game changer. Now you didn't have to worry about losing your token or constantly carrying it with you. It made logging in a lot smoother while still keeping the security of two-factor authentication. And, of course, push notifications made it even easier. Some services would just send you a push notification to approve your login attempt, cutting out the need for typing anything at all. That's on for all the stuff that has a credit card for me. Everything texts me a one-time password, because I trust no one. I trust no one. Oh, maybe we'll get to that, yeah. Okay. Renee has a zero-trust architecture right here. But text, text is bad. Yeah, because isn't that the thing about that TV show 24? He would always use Bluetooth and text, and you're like, it's the most insecure thing. Like, if you really care about this, you wouldn't use it. But go ahead. No, it's like, yeah, SIM swapping, man, and SS7, like, the breaches on, oh, gosh, yeah. Okay, so there's this, like, weird period of time where...
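Since the episode keeps coming back to those rotating codes, it's worth seeing what the math actually is. Apps like Google Authenticator implement the open TOTP standard (RFC 6238): an HMAC over a counter derived from the clock. Note this is the modern open cousin, not a SecurID clone; the original fobs used RSA's own proprietary algorithm with a 60-second window, while authenticator apps typically use 30 seconds. A minimal sketch:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, step=30, digits=6):
    """Time-based one-time password: HMAC-SHA1 over a clock-derived counter."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    mac = hmac.new(key, msg, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                          # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Fob/app and server share the secret; same time window -> same code.
```

This is also why fat-fingering mattered: the code is a pure function of the secret and the current window, so once the window rolls over, the old code is simply a different counter.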
IT administrators and tech folk and all that, we've got these tokens, right? But we don't see the first smartphones show up until, right? Right. Almost 2010. Yep. So 2007, 2008, 2009, right? And in fact, the first version of Authenticator is 2010. So there's this gap period of time. But everybody did have phones. So you would do this SMS text. Sure. Great. SMS text. But, you know, SMS really only lasts about 10 years before people are like, don't do that anymore. And I think it's 2016 when NIST basically says, don't do that. Don't do that anymore. Like, that's really bad. Don't do that anymore. Yeah, early 2000s, this period of time where enterprises had strong authentication internally, whether it was tokens or VPNs or smart cards, and customers were still living in the land of passwords. Banks were the ones, we just talked about that, sending out physical tokens. But frankly, having worked in a bank, that was technology that when I got to one of the banks, HSBC, we were ripping that stuff out, because it's too expensive to support. It just was too inconvenient for people to mess with. But we did start to see soft tokens, desktop apps, Java apps, rotating codes, SMS, and that SMS bridge. It's cheap, universal, easy to roll out, very simple. I got a text message. I put it in. It wasn't necessarily super time-sensitive, right? You got 10 minutes, right? Yeah, exactly. Before time runs out. That's fine. But yeah, that golden age gets squashed because of SIM swap and the SS7 vulnerabilities. But by 2010, that's when everything starts to switch. So what happened with SMS, the SIM swap and the SS7 network vulnerabilities? Like, what was happening there? Okay, so... I'm going to think that not everyone who listens to this podcast would be like, dude, I remember that. So what was it? Yeah, yeah, yeah. Okay.
So it's basically just like, you know, you saw on 24, you know, what's his name? Jack. Yeah, Jack Bauer. He can clone somebody's phone, you know, and magically see everything that they're doing. It's not exactly like that. But SS7, Signaling System 7, was and is vulnerable to interception, basically. Right. So with SS7, you know, the vulnerability, okay, SS7 was designed in the 70s. So this is kind of the first implementation of, I guess you could call it, digital switching services, right? Right, on the network. But you could pick it up because it's not encrypted. And so it's like... It's protected, or it's secured, because it's a closed-loop system. As they say, there's not a lot of people on it, I can't get to it. But that's not the answer. That's not the answer, right. People still break out of jails. They're meant to keep them in, but they still break out. Yeah, I got you. All right. Yeah, and then SIM swap is a different type of attack altogether, but it's the same result. Somebody can impersonate you or impersonate other people, and then they send you something, and then you send them something, and then they impersonate you. So it's like a man-in-the-middle attack, basically. But frictionless is what we're going for, to make it easier and easier. And that's what all CISOs and product people and all of that are trying to get to: make it easier to use and more reliable. But, you know, once you remove that friction, people stop trying to work around the system. And perhaps the irony is that we keep chasing convenience until we realize the safest thing is to make security invisible. I always say that's the best way to do it. And if you have to watch how I walk in flat shoes, then so be it. Let's fast forward to today, and we're seeing the rise of continuous behavioral biometrics. This is the future of authentication, you guys.
Instead of needing to enter a code or password, we now have systems that continually track how we interact with our devices to confirm who we are. It's like security is now invisible, working in the background all the time without us even thinking about it. So think about it. Keystroke dynamics is a great example. It's not just about what you type, but how you type. Do you type faster than the average person? Do you have a unique rhythm to your typing? Machines can track all of that and use it to verify your identity. Even if someone stole your password, they wouldn't be able to type like you, making it way harder to hack. And it doesn't stop there. Gait analysis is another example of behavioral biometrics. Basically, it tracks the way you walk. Every person has a unique gait, whether you walk with a limp or have a particular stride or sway when you move. And seriously, Bigfoot does. And now smartphones and wearables can analyze that and use it for authentication. So instead of needing a code or a physical token, your device could authenticate you just by watching you walk. And the coolest part? It doesn't stop when you log in. Your device can continue to authenticate you throughout the day without interrupting your experience. Yeah. So, okay. I have to say, I have played a lot with behavioral biometrics. I was doing this thing with the innovation team at WorldPay, because we were playing with this whole tap-your-phone-to-pay kind of thing, right? Using your phone. And we were saying, well, you've got to be able to put your PIN in, PIN on phone, right? It was a big controversy. Is a phone secure enough to do PIN on phone? Well, we'll build this thing to do all sorts of different protections there and make it more secure and all of that. And then one of the guys said, yeah, we'll do analysis on PIN entry.
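The core idea behind keystroke dynamics can be shown in a few lines: enroll a profile of inter-key timings, then accept an attempt only if its rhythm stays close to the profile. Real systems use much richer features (dwell time, pressure, trained models); the sample timings and the tolerance below are invented purely for illustration:

```python
from statistics import mean

def enroll(samples):
    """Average several typing samples (lists of inter-key intervals in ms)
    into a per-key timing profile."""
    return [mean(col) for col in zip(*samples)]

def matches(profile, attempt, tolerance_ms=40.0):
    """Accept only if the mean per-key timing deviation stays under tolerance."""
    deviation = mean(abs(p - a) for p, a in zip(profile, attempt))
    return deviation <= tolerance_ms

# Three enrollment samples of the same 5-character passphrase (4 gaps each).
profile = enroll([[110, 95, 180, 70], [120, 90, 175, 80], [115, 100, 185, 75]])
```

The legitimate typist drifts a little from day to day and still passes; an impostor who knows the password but not the rhythm usually lands far outside the tolerance.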
And I was like, okay, that's cool. Show me how it works. And he puts his PIN in. And then somebody else says, oh, well, I'll try to use your PIN, and it doesn't work. And I said, put your PIN in. And I watched him. And then I put it in exactly the same way he did. And it got in. But, I mean, isn't that the point, though? Like, that's the point. I'd have to watch you do it and copy it completely. Yeah. Right? And do it. Like, I think if you watched me, because I use a full phone number, including the number one. Now, if you watched me do that, you'd be like... because I use three fingers. I don't use one. So it would be hard for you to figure out why certain numbers happen so fast and why other ones are slower, because I'm switching fingers, right? But think about that, though. Now the hacker has to kind of be sitting next to you, right? So it's like, you know, the call's coming from inside the house. PINs, I think... like, PINs are just really crappy passwords. There you go. Right, because they're only numbers, and there's four digits, and some people think, oh, well, there's 10,000 combinations, right? That they're all acceptable PINs. No, there aren't, because there's a whole bunch of PIN combinations that are not acceptable. You know, so 9-9-9-9, and 0-0-0-0, and 1-1-1-1. Right, 1-2-3-4, like an idiot would have on their luggage. Yeah. Exactly. What kind of idiot puts that on their luggage? Yeah. Oh, no, I have to go change my luggage. Yeah, the whole joke. But it's true. There's a bunch of ones you can't do. So at least it takes a whole bunch off the table. It's very bad. I'm glad we're moving away from PINs as a whole in the industry. But, you know, one thing about behavioral biometrics that people sometimes talk about is the collection of the data itself.
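The "there aren't really 10,000 acceptable PINs" point can be made concrete. This sketch applies a toy denylist, all-same digits plus ascending and descending runs; real issuer denylists are much bigger, catching dates, years, and keypad patterns too:

```python
def weak(pin):
    """Flag PINs a toy denylist would reject: all one digit,
    ascending runs, descending runs. Real denylists are far larger."""
    d = [int(c) for c in pin]
    all_same = len(set(d)) == 1
    ascending = all(b - a == 1 for a, b in zip(d, d[1:]))
    descending = all(a - b == 1 for a, b in zip(d, d[1:]))
    return all_same or ascending or descending

pins = [f"{n:04d}" for n in range(10_000)]
strong = [p for p in pins if not weak(p)]
# Even this tiny denylist removes 24 of the 10,000 combinations,
# and the real problem is that humans cluster on the removed ones.
```

The deeper issue Marc is gesturing at is not the 24 excluded codes, it's that user-chosen PINs are heavily skewed toward exactly those memorable patterns, so the effective search space for a guesser is far smaller than 10,000.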
And what you can do with that information, like what the privacy concerns are, and all of that. And it came up early on when PSD2 came out, regulations, GDPR: what's the GDPR implication of collecting that information, that kind of thing. And regulators took a, let's say, more open stance in the EU, which, you know, normally you wouldn't see. The EU folks, they'd be like, power to the people, thou shalt not collect any personally identifiable information. But in the EU, you can actually collect that information as long as you disclose that you're going to collect it and you're using it for that kind of purpose. What you say you're going to use it for. You're not going to use it to notice that my gait has changed. I'm favoring my left foot. Definitely something wrong with my right. So you're not going to cover me for orthopedic surgery. Like, that's what happens with that data. You can really weaponize that against somebody if you had enough of it. So I always worry about that. Yeah, it turns into less of a privacy problem and more of a consent management problem, you know. But, gosh, I can't even tell you how many different stupid applications we had at HSBC to do all this background authentication, essentially. Every single action you took was measured by a different application to make sure that it was you, and you weren't doing something that you shouldn't have done. And then if you did do something that was a little bit riskier, it would prompt and say, well, I'm not sure that that's you. Can you give me some other piece of information? Or it would collect another bit of data, and then you'd build a risk score around that. So things like when you wanted to set up a new payee or move 10,000 pounds or something like that. So it was always watching.
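The step-up pattern described here, score each action on its signals and only challenge when the risk crosses a threshold, looks roughly like this. The signals, weights, and threshold are invented for illustration; they are not HSBC's or any real bank's rules:

```python
def risk_score(action, amount_gbp=0.0, new_payee=False,
               device_trusted=True, behavior_match=True):
    """Toy additive risk score over a few illustrative signals."""
    score = 0
    if action == "payment" and amount_gbp >= 10_000:
        score += 40          # large transfer
    if new_payee:
        score += 30          # first time sending to this account
    if not device_trusted:
        score += 25          # unrecognized device
    if not behavior_match:
        score += 25          # typing/gait signal disagrees with the profile
    return score

def needs_step_up(score, threshold=50):
    """Above the threshold, ask for another factor instead of failing outright."""
    return score >= threshold
```

The design point is the one made in the conversation: most actions sail through invisibly, and friction is spent only where the score says the risk is concentrated.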
But, you know, as long as you were transparent about the use case and managed that consent, then people were okay with that. And from a risk perspective, you kind of have to look at it and say, well, maybe you don't get to use the app if you don't consent to that information, because the attack vectors are so many and the threat is so high. Like, I can't let you use the mobile app. Sorry. You're going to have to use online banking or you're going to have to go to the branch or something, because it's just so much risk. And in some jurisdictions, if there's a breach, then the bank is on the hook, not the consumer. And sometimes it's the consumer on the hook and not the bank, in various situations. And you want to protect your customers as much as you want to protect yourself in those situations. So maybe not let them use it. Yeah, your reputation is on the line too, right? I think they would say, it's not my fault you don't have better systems. And then they would turn around and say, it's not my fault you didn't have multi-factor authentication. But what if the person you're saying that to is a 92-year-old? Yeah, yeah. You know what I mean? There's a lot there, right? I'm not concerned about the digital natives taking up pitchforks. But I am concerned about the boomers being like, you can't add one more thing. You've already told me I can't show up at a branch and that this phone is how I'm going to do my banking. And now you're going to tell me I need to authenticate 19 times? Like, seriously, it's not fair. Yeah, yeah. No, I mean, that's the always-on idea, basically. Yeah.
So this idea of continuous authentication, it's always running in the background, monitoring your behavior, making sure that the person interacting with the device is actually you, whether you're typing, walking, or even holding your phone in a certain way. You remember when Steve Jobs was still alive and he was like, you know, you got to hold the phone this way? Because remember the antenna? Right. It was like everybody's behavioral marker for that one wouldn't work. Yeah, because we all were holding it up right and making sure we didn't have it. You guys have no idea how bad the first iPhone was. It's a miracle that we are where we are today. That's all I'm going to say. And I am really critical of how much Apple, like, innovates. But honest to God, like, we don't have an antenna to worry about anymore. So that's actually a good thing. Okay, so what's next? We're already seeing these continuous biometrics making their way into smartphones, laptops. Laptops. Like, I can authenticate or I can stick my finger on a key. And that's going to authenticate for me, right? And even cars. The future of authentication is headed toward frictionless security where we don't even need to think about logging in. Our devices will simply open. That's the dream, right? Imagine a world where you never have to type in a password or worry about remembering codes. I want to live in that world. You just walk into a room and the door unlocks for you because it knows it's you. Or you sit down at your desk and your computer automatically logs you in based on how you hold your mouse or keyboard. And as we see the rise in AI and machine learning, these systems will only get smarter. They'll adapt to changes in your behavior, making sure that even if you have an off day, your authentication still works smoothly. It's like having a personal bodyguard for your data, always watching, always protecting. And I'll give my phone this. It never gets confused. 
I don't look at it and it's like, who the hell are you? Like, that never, ever happens. So maybe that's a good thing, right? As creepy as facial recognition technology makes me feel, I'm glad it's on my phone. I'll just, yeah. Yeah. Well. Boy, I mean, the iPad and the iPhone and the Google phones and all that stuff, they automatically unlock for you now. Which, you're right, is cool. I like that. And I'm glad that it's device-bound, not cloud-bound. Right. Right. Yeah. And it's held in a secure element on the device, not sent up to the evil... To teach an algorithm how to, you know... Yeah, how to find chubby white girls with dark hair, because that's all I'd be telling you, right? Like, I don't know why you care. I honestly don't. You can't train models like that, because... so I don't know. Yeah, yeah. I think that's definitely a future, or a scenario, that I think is good. It's a vision of the future. It's progress. One in which your device uses AI to make it safer for you, right? I think that's all good. But I think we're kind of looking at something that's even potentially bigger. Authentication, as we talked about, is proving, you know, you are Renee and I am Marc, and we're actually who we say we are. But what happens when it's not me logging in anymore, and it's not you logging in anymore? It's your agent, right? Your digital twin. It's that little bit of software that's acting on your behalf, doing whatever it is going to be doing. And think about this. My app talks to some other app, you know, to get scores. And this is how these banking apps work today. So that's an application that's on my device that's talking to another application that does something that then builds a score.
Like, I'm not in that interaction at all, right? It's acting on my behalf. But then when we start talking about this whole agentic thing that either is going to happen tomorrow or never at all... Yeah, who knows? But if these applications are acting on our behalf, then they have to be authenticated. And that authentication is about intent as much as it is identity. And I think that agency itself is the thing that's going to have to be authenticated. It's not enough to trust the user, right? Trust is not a control. Trust is not a control. Trust but verify. And we'll have to trust that digital representative, right? If they're acting... and this is where I think it's going to get really weird, but I definitely see this happening... where I tell my agent to go do something, and it interacts not just with another agent, but it has to interact with a human somewhere. You know, so how does the human then know that that agent is acting on my behalf, right? Machine-to-machine authentication is easy. Right. There's protocols for this. It's been done. That's a clear scenario. But then when it's a human that has to abstract their identity to an agent, and that agent then has to authenticate to a human, how does that happen? That's like reverse authentication to what we have today, where humans authenticate themselves to the machine. I hadn't thought about that. Like, what if you're an airline, and we all have travel agents now, and it's literally agentic AI? Like, I don't have a travel agent. I have a digital twin of somebody who does that stuff, and it knows my preferences. It knows when I'm leaving. It looks at my calendar. It knows where I'm supposed to be. And it says, let me go take care of this. And it books all my travel, my hotels, my expenses, whatever it is. It does it all on my behalf.
And other than my having that agent and telling it where I want to be, it has nothing to do with me, except that it's now working with an airline, right? And it's talking back and forth with those systems. If something has a hiccup and a human has to reach back out to me, you're right. How do they do that? Right. It's the reverse of what we're trying to do. I'm trying to abstract myself from that layer, and they need to reconnect it. Right. Yeah. It's weird, and I don't think we've come up with a lot of these use cases, but I think the travel one is a good one. Because, you know, something does happen. My flight gets canceled, right? Well, yeah, there's an automated system process to do all of that, sure. But how many times have you talked to an agent, a person, to go and rebook things? Because, I mean, look at the Southwest Airlines issue, right? That application that they were using, it couldn't do the work that it needed to do. So it required actual humans to do that, right? To recalculate all the bookings and the routes and all that stuff. So there's definitely scenarios where a human has to interact with another computing system and the authentication path is reversed. And so we have to come up with some human-to-agent-to-agent-to-human mutual authentication mechanism. And I think it's definitely something that will happen at some point. But AI is definitely going to be a part of all of this stuff, right? And we talk about that all the time. AI, woo! You know? But I think, you know, who's authenticating to whom? I think that's going to change. And that continuous contextual behavioral data is still going to matter. But it might describe the pattern of agency, not just the pattern of behavior. So it's not just proving who you are. It's proving who's acting for you, on your behalf. Yeah, I think that's the next phase of trust online.
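One plausible shape for that human-to-agent-to-human chain is a signed, scoped, expiring delegation: the user signs a statement of who may act, on what, and until when, and the relying party verifies the claimed intent against that scope. This HMAC sketch is a toy with a shared secret; the names and scopes are hypothetical, and real designs would lean on public-key credentials (think OAuth token exchange or verifiable credentials) rather than this:

```python
import hashlib
import hmac
import json
import time

def delegate(user_key, principal, agent, scope, ttl_s=3600):
    """The user signs a statement of agency: which agent may do what, until when."""
    claims = {"on_behalf_of": principal, "agent": agent,
              "scope": scope, "exp": int(time.time()) + ttl_s}
    body = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(user_key, body, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def verify(user_key, token, needed_scope):
    """The relying party checks the signature, the expiry, and that the
    requested action (the intent) falls inside the delegated scope."""
    body = json.dumps(token["claims"], sort_keys=True).encode()
    good_sig = hmac.compare_digest(
        hmac.new(user_key, body, hashlib.sha256).hexdigest(), token["sig"])
    return (good_sig and token["claims"]["exp"] > time.time()
            and needed_scope in token["claims"]["scope"])

# Hypothetical: Renee delegates flight booking, and only flight booking.
token = delegate(b"renee-device-secret", "renee", "travel-agent-01", ["book_flight"])
```

The point the sketch makes is the one in the conversation: the airline isn't just verifying an identity, it's verifying a bounded grant of agency, so the same token that books a flight can't move money.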
You know, we've spent decades authenticating people. Okay, I go back even further than decades, you know: the Romans invented passwords. 2,000-year-old technology. So it's, like, whisper in my ear and I'll let you in the building. Was it that kind of thing? Yeah, pretty much. Passwords, passphrases. Guards, you know, yeah. Do you know how we used to guard the missile silos in the deserts of Arizona? Oh, yeah. Oh, boy. Right? Like, first of all, you'd have to get past the snakes. If you got past the snakes, right, they would buzz you. You'd have a code. And you would say that at the door, and they would let you in. And then once you were in the man trap, you would take that code that you had on a piece of paper, light it on fire, and drop it in there, because no one could know that's what it was. And then you had to punch in another code, and they would punch in a code back. You would punch in one more time, and then that door would open. Right. And all of it was manual. There was no biometric scanning of your eyeball. No, you had a piece of paper that had a password on it, and you lit it on fire after you used it. That's how far we've come. Right. That's crazy. But even though we've come so far, and technology has automated a lot of that, the underlying use case is still the same. Who are you? Are you allowed to do what you say you want to do? That proof of who you are. Decades authenticating people, then devices, then behavior. Now it's going to be about intent, and whether it comes from a person, a bot, or some system that's spending money on your behalf. Oh, wow. It's getting so complicated. So from the days of RSA SecurID tokens to the future of continuous behavioral biometrics, we've come a long way in how we secure our digital lives. It's amazing to think that one day we'll be able to authenticate ourselves just by being ourselves.
No passwords, no tokens, just the way we move, type, and interact. That's crazy. But you know what? If I've learned anything, humans are incredibly predictable. So there you are. Thanks for joining us today on the Nostalgic Nerds podcast. Be sure to subscribe and share with your friends. Let us know your thoughts on the future of authentication. What's the coolest thing you think will happen next? And hit us up on social media. Email us at nostalgicnerds at... is it the whole "podcast"? Yes, nostalgicnerdspodcast at gmail.com. Good luck with that. And email Marc. He reads them. So thanks, nerds, for tuning in. We'll see you next time. Catch you later, nerds.