The Nostalgic Nerds Podcast
The Nostalgic Nerds Podcast, where we take a deep dive into geek culture, tech evolution, and the impact of the past on today’s digital world.
S2E3 - The Internet We Lost
Thanks for listening. Please rate and share with friends and family. It really helps!
In this episode of The Nostalgic Nerds Podcast, Marc and Renee reflect on the internet we lost and the early promise of a global commons that rewarded curiosity, wit, and genuine insight.
They explore how identity, scale, and memory shaped early online communities, why empathy felt possible in smaller spaces, and how engagement-driven incentives quietly rewired behaviour as the internet grew.
What emerged wasn’t the place many of us hoped for, but an environment optimised for attention, outrage, and monetised conflict. A place that’s now deeply resistant to change.
This isn’t nostalgia for dial-up speeds or blinking cursors (there's some of that). It’s a reckoning with what we were trying to build, what we actually built, and what it might take to do better next time.
Featuring the episode companion song "GeoCities (You Let Me Be Ugly)"
Join Renee and Marc as they discuss tech topics with a view on their nostalgic pasts in tech that help them understand today's challenges and tomorrow's potential.
email us at nostalgicnerdspodcast@gmail.com
Come visit us at https://www.nostalgicnerdspodcast.com/episodes or wherever you get your podcasts.
Renee:I still remember the exact physical posture of getting online in the 90s. Lean forward, finger hovering over the mouse, listening, because you had to listen to that dial-up sound. Remember that? It wasn't just noise. It was a warning. It told you that you were about to leave the physical world for a little while and enter a different place. And it felt like a place, not a service, not a product, a place you visited on purpose. And once you were there, everything was slower. Text loaded line by line. And I'm not talking buffering. I'm talking literally loading line by line. Images felt like events. You didn't scroll endlessly because there wasn't anywhere to scroll to. You just waited. You read. You thought. And that slowness shaped how people behaved in ways we didn't understand at the time.
Marc:Yeah. When you said line by line, it reminded me of Usenet. I didn't use Usenet very much, but I do remember it way back in the day, especially when I was doing research as a sophomore or junior at university and needed to get images of certain manuscripts. And it would load, like, one line, and then you wait, one line...

Renee:Can't wait to see what it looks like!

Marc:I know, exactly. All this anticipation, right? But those Usenet forums were these kind of long, rambling threads that people would drop information into, and then people would respond, and then they would quote each other and respond again. And, oh, I actually downloaded a Usenet client just today to mess with it, and it's still just as tedious and pedantic as it was then. But it was incredibly civil in comparison to what we do now, because if you were rude or sloppy or cruel, people remembered. It's like the beginning of the "well, actually" meme, right? And it's funny, I want to jump into identity in a little bit here, but your handle, your persona, your username, it meant something. It was this digital persona, and your reputation was attached to it inside that community. There was that kind of accountability through memory, not necessarily moderation or enforcement, and those things turn out to be very different. Hey folks, welcome to another episode of the Nostalgic Nerds podcast.
Renee:Today we're going to talk about the history of the internet. The internet! That's how we met. We met because of the internet.
Marc:Yeah, I guess so. I mean, you're at Ketchum, right?
Renee:Yeah, well, I guess we met before that, but the time that we worked together in the same place, the same time on the same thing, the internet brought us together. So I'm nostalgic for the internet. I always think, when I think back to it, I think, aw. Anyway, we want to talk about the internet we were promised.
Marc:And the one that actually got accidentally built instead.
Renee:The thing people forget is how explicitly optimistic the early internet narrative was. This wasn't accidental hope or vague cultural enthusiasm. It was stated. Academics wrote about it. Journalists framed it that way. Technologists evangelized it. What was Google's thing? Their mission statement was, be nice.
Marc:Yeah, don't be evil.
Renee:Don't be evil. That's all it was. How did they stray so far from that? The belief was that exposure would create empathy, that if people could see one another directly, hear voices without intermediaries, share experiences across geography and class, something fundamental would shift.
Marc:You know, that concept is not new, right? The one that a community creates empathy. This is why Neighborhood Watch exists, right? If you know your neighbors, you're less likely to commit crime against them. But that idea of community and class, it started out with good intentions. And I remember being at university, and do you remember this, the CIA Factbook? Do you remember seeing that? No.
Renee:So remember when I was in college, I cared about acting. Not really.
Marc:Yeah, I suppose. I mean, when I was at university, it was this kind of special time, right, when the number of web servers was starting to climb. The public web, what we think of as the public internet now, was just starting in 93, you know, October, November. And by the end of 93, you had about 600, 650 public web servers. But before that, you had Telnet and Usenet and different kinds of bulletin boards and things like that. And I remember the CIA Factbook, not so much because it was the CIA Factbook, but because there was this promise of information, right? I had an English degree, or have an English degree, so I was interested in research, in reading things, in material that I could get. And there was this promise that research and information was instantly available instead of in the stacks in the library. And that's where these kinds of tools, at least for me, were really promising, right? You could interact with people that had common interests. And there's this sort of deeper assumption underneath all of that, that misunderstanding was the real problem, right? The conflict that might arise was because we didn't have enough information about one another. Like, if I had some sort of beef or issue with somebody, it was a fundamental misunderstanding, a lack of empathy, right? And so if you removed all of these barriers and filters, people would soften. You build a connection. Connections reduce conflicts. And visibility leads to accountability. And so that kind of sense of community naturally flows from institutions toward individuals, just because of access. And, gosh, as a university student, somebody that didn't have a bunch of money, I believed, and lots of people believed, that information should be free, right? And that's not a metaphor; that's a real principle. The idea that once information is available to everyone, people would behave responsibly with it.
Renee:Oh, of course, because people can be trusted.
Marc:Sure, right? You know, if you have context, right, you can compare, you can update. When you say it out loud, though, now, it sounds like we collectively forgot everything we know about incentives and human psychology.
Renee:I think for me, that's it, though, right? That's what got me thinking about the whole thing to begin with. When I think back to where it all started, like, Al Gore invented the Internet in the 70s. But it was academia, right? Academia had a platform that they would talk to each other on, but it was closed. Only they could get in it. Only they could talk with each other. And they're professors and doctors and philosophers, and yes, they think long and hard before they write. They challenge each other respectfully and professionally. I think the idea was that all of us would behave like that, that all of us would behave like them. If we take down all these barriers, we can create community. We're all going to do this, we're all going to be contextual, we would update our thinking, we would evolve. And, my god, what? No. No. Yeah, they behave that way. They behave that way, for sure.
Marc:I wonder how much of that is because of, as I just said, incentives, right? Human psychology and incentives. The incentives to use the tools in academia were peer review, research, knowledge and information sharing, those sorts of things. But that's not the primary use case once it leaves academia. Yeah.
Renee:Not even close. It's not even close to what the use case ends up being once it leaves academia. But think about it at that time, though. We just did a whole podcast about how much it used to cost us to dial long distance. This was a way for them to actually work with people in other countries. So if Australia, the United States, and the UK are all working on a science project together, and you did it over the academic network, you were actually doing all of that all the time. And they were doing that back in the 70s, right? Like, that's crazy. They had a really cool thing going. They should have just kept it to themselves,
Marc:Honestly. It's that assumption, right? The early assumption is that access to these academic systems, the processes and procedures they were using, would also produce the wisdom that gets associated with them. If you have access to the same sorts of things, then obviously it would be used in the same way and it would produce similar wisdom. More information leads to a better decision. If you remove friction and authority, that leads to self-regulation. There were a lot of libertarian ideals in the early internet. And there wasn't really a lot of discussion around incentives. We talked about freedom, talked about rights, talked about privacy. We didn't talk about the behaviors, about what kinds of behaviors would actually get rewarded when you bring scale into the equation.
Renee:But here's the important part. It made sense in context. Gatekeepers really did have flaws. One thing that's easy to forget is that the early Internet didn't rise because people suddenly hated institutions. It rose because trust in them was already cracking. Like, today we talk about having zero trust. That was starting way back in the 80s, I got news for you. And it just kept coming, right? We just kept building on it. Media was centralized and gatekept by a small number of voices that increasingly felt out of touch. Governments looked slow and bureaucratic at exactly the moment technology was accelerating. Imagine the disconnect there, right? Corporations were becoming more powerful and less accountable, and expertise itself was starting to feel insulated from lived reality. So this idea that if I can get there frictionless, with no gates, getting farther and farther from the institutions, it's supposed to open up the world to me. It's not supposed to warp my reality and cause me to have to take social media days off, right? I think that's the part. In context, when we think back to when that was, it makes perfect sense why we thought that was a good idea. We weren't being ridiculous. We weren't being pie in the sky. We weren't not thinking about risk. In our minds, the risk was: you don't stay in the plastic kingdom of Ronald Reagan; you find your way out and you find the real world. And I think that was our mistake. It was never the real world. It was never the real world.
Marc:Yeah. It's something you said when you mentioned Reagan. I know we joke about Al Gore and the internet and stuff, but okay, think about when the internet hit the public hard. It's in that 90s period, right? 93 to 95, 96, and then obviously the dot-com era. This is when it starts to have, I would say, that kind of cultural impact, right? Mid 90s. And who's in the White House in the mid 90s? It's not Reagan. It's Clinton. Yeah.
Renee:That's why Al Gore invented the Internet for folks listening. Al Gore invented the Internet.
Marc:I mean, yeah, I didn't even bother going into that, but this is... Yeah, there are valid reasons why he would have said something along those lines. Not exactly the way he said it, but, you know, it is what it is. But when the Internet shows up, it promises this kind of direct access. No filters. You can speak and publish without permission, right? It's a correction against kind of the previous decade or so. Not in a chaotic way; it's a way around the institutions that looked compromised or obsolete or disconnected. So that optimism, it wasn't random, right? Culturally, in the 90s, there was a lot of positive zeitgeist. I mean, there were still problems. We had Gulf War I and then Gulf War II leading into the late 90s and early 2000s. But Gulf War I happened and then it was over, right? And then you had this boom in the mid-90s, economically, politically, socially. So the internet is kind of this response to the failures of the previous couple of decades or so, assuming that removing these kinds of barriers will automatically produce something better. But I want to talk about one filter, though, since we're talking about filters. I want to talk about identity and anonymity, the thing we don't really talk about, or the thing we didn't really understand at the time the internet goes big, which is identity. The internet actually supports, and I want to talk about three different kinds here, only one of which has ever worked at scale. So, the three types of identity you'll encounter on the internet. First, there's stated identity. It's your real name, tied to a real human. But the internet didn't invent that, right? Governments and banks did. And that identity stays locked inside an official system, an official context, a very specific use case. It doesn't spill back out into the wider internet. That verified identity, in most cases, and we're not talking about everywhere, but in most cases, my stated physical human presence is not something that's associated with a digital persona. What the early internet actually built was abstracted identity. That's the second kind. Abstracted identity: handles, personas, your user account, your username, a name that meant something inside a community, even if it meant nothing outside of it. Like I have a Reddit account, my Reddit account is X, whatever. And that worked as long as the communities were small enough to remember who you were, right? And then there's anonymity. No memory, no continuity, just isolated actions. So the problem isn't that these abstracted or anonymous identities exist. The problem is that we scaled them globally without scaling memory, norms, consequence, right? And once that happens, behaviors change. People hide behind these abstracted things, not because the people changed, but because the system stopped caring who these people actually were.
Renee:But that's it, right? And when you think of, so if you go back to the idea that we were trying to get away from power centers, we were trying to get away from consolidated media, we were trying to get away from consolidated government, but we didn't eliminate the power, we just moved it into faster, less visible systems. Like you said, and decades later, we're still living inside that trade-off. We treated filters as purely oppressive instead of recognizing that maybe, just maybe, they also created a shared reality. They slowed things down. They established norms. They gave people a common set of reference points. And when you remove them completely, you don't get pure truth. You get a fragmented reality that means different things to different people at different times.
Marc:Yeah. I mean, you think about removing filters, whether it was identity or control. People thought, okay, you remove these things, and that means removing bias. But actually, it removes the referee, the moderation, the behavioral control. And so let's talk about scale, right? We've talked about scale before. When you scale things up, the behaviors don't necessarily change underneath. What does change is the reach, the scope, how fast things happen. So when we scale these sorts of things up, the internet stopped being a collection of communities and became this global pressure system, right? In the small forums, mailing lists, early blogs, memory mattered. People remembered what you said last week in those smaller contexts, and they remembered if you were thoughtful or careless. That memory creates consequence and shapes behavior. But when you take that limitation away, those things matter less. Scale erodes that. Memory disappears. You can say something reckless, watch it explode, and then vanish into the noise. Consequence thinned out. Speed replaced reflection as a primary value, so being first mattered more than being right. I don't know if you've seen this in blogs and comment forums and posts, but it used to be popular for people to literally just comment "first." Like, really, what does that add to the discourse?
Renee:Yeah, because that matters.
Marc:Yeah, right. So being loud, being fast mattered more than being accurate or having any substance. And suddenly the traits that once got corrected, confidence without nuance, certainty without evidence, emotional intensity, start to get rewarded instead.
Renee:And exposure didn't create empathy the way we thought it would. It created comparison. People weren't learning about one another to understand. They were measuring themselves against curated versions of everybody else. And the connection didn't reduce conflict. It localized it, amplified it, and made it personal. And power didn't disappear. That was the biggest illusion. It reorganized itself. It moved into systems that were faster than governance, harder to see, and less accountable. Algorithms, platforms, incentive structures started shaping outcomes more than any single institution ever had. That's it. Like my relationship with health and human services was not personal. I don't know anybody there. The passport office. I just send my stuff. It comes back. That's my relationship with the institution, right? It wasn't like Facebook, right? It's not like my relationship with that institution now. It's completely different. And unlike traditional power, this version didn't have a clear address. You couldn't vote it out. You couldn't subpoena it easily. You couldn't often see it working. That's weird.
Marc:Like, you think about it. Take the internet, the FAMGAs, or whatever the new acronym is for the big platforms, and take them off the map for a second. If I need a new tire on my car, I know who to call. There's a phone number. There's a person. There's an interaction. Who do you call if there's something wrong with Facebook?
Renee:Good luck. Nothing. Nobody.
Marc:There's nobody. But for some people, that platform controls, maybe not controls, but influences a very large part of their life, you know? And for them to not be able to hold that platform accountable, or to have an interaction that pertains to governance or rights or privacy or anything like that... Like, can you take Facebook to small claims court? I suppose you can, but... Good luck. No. Yeah, good luck.
Renee:They countersue you. And then all of a sudden, you have to get a real attorney. No.
Marc:Yeah. No. I mean, that scale is just...
Renee:I mean, let's call it what it is, though. Let's seriously call it what it is. You are not the customer. You are the product. And so there's no reason to do any of that for you. You're the product.
Marc:Yeah. Totally. And I want to remind people how quickly the internet got commercial, right? We'll go into some of the brands and stuff in a minute, but as I said, the first wave of public web servers shows up in 93. There were public web servers before that, but by the end of 93 you had about 600, 650 or so. Then early in 94, PizzaHut.com shows up, and you could actually buy a pizza on PizzaHut.com. That's very early in the public internet, right? And the first big-scale advertising platform shows up in 96, DoubleClick.net. So you're talking about just a few short years from the time there was nothing for the public to a full-blown advertising platform that starts to shift behavior and control.
Renee:Here's what's crazy about that. We go from advertising in the yellow pages, where you had whole advertising companies that did yellow pages and white pages advertising, and all of a sudden they're faced with the fact that no one wants to buy there anymore. They all want to buy on the web. So they all had to figure out how to turn what they used to do, negotiating with all these different white pages and yellow pages companies to get ads in them, into advertising on the Internet. It completely changed that advertising business along with it. And it happened pretty quick. It felt like it was overnight, for sure.
Marc:Overnight. Right. I mean, you know the Ketchum business, right? The directory business was gone.
Renee:In a couple of years.
Marc:Yeah, years. And we talked about catalogs and directories and stuff in a previous episode. But the theory of operation completely shifted, right? You used to buy advertising as a business in a directory; then it switched to the atomic buying and selling of eyeball time. And that happened, as you said, in a few short years, to the point where the entire impetus for the internet today, I would argue, is as one giant advertising platform to get more eyeballs on more content, more structured content. And this is why the internet doesn't feel neutral anymore. It's directional. It nudges behavior without announcing itself. It rewards outrage over restraint. It rewards advertising placement over the best product, and it certainly rewards that kind of behavior over curiosity, performance over reflection. And the unsettling part is how quietly it does this. Not through rules, but through these feedback loops, engagement loops. And those aren't the instincts we designed for in the 90s. We designed for curiosity. We designed it as a way for structured, semi-structured, and unstructured data to be shared rapidly, for experimentation, for dialogue. And what we got instead, because of scale, was optimization. Once the system starts optimizing human behavior at scale, it stops being a medium and starts being an environment, a mechanism of control.
Renee:This is where the brands enter the story, and for me, first of all, I love brand risk. You can't get me not to talk about it, mostly because I get to throw brands under the bus. But this is where brands entered the story, whether they wanted to or not, because once shared reality fractured, people started looking for symbols, something concrete, something visible, something emotionally legible that they could argue about, rally around, or push against. Brands filled that role almost perfectly, not because they're powerful in a political sense, but because they're present and close to us in everyday life. They operate in a public space without public authority. They're private entities, but they're culturally visible in a way governments often aren't anymore. People see them daily. They interact with them constantly. So when institutions feel abstract and unreachable, brands become the most accessible targets for meaning making. And meaning is something the internet is very good at assigning, whether it deserves it or not, right? People project values onto brands. This happens all the time. They call it the values-based customer, a customer who says, I believe in DEI. And it used to be, I believe in DEI, so I'm going to shop at Target. Target screwed that up in a royal way. But that used to be what it was. That's what the brand was known for: if you had a 10-year-old who identified as gay, they could come shop there. They could get cute pride stuff for a 10-year-old. Not anymore, but you used to be able to do that. Purpose statements, mission language, identity-driven marketing: every action becomes interpretable. Every outage becomes incompetence. Every ad becomes ideology. Every silence becomes intent. Even neutrality gets read as a choice. The internet doesn't allow brands to simply exist. It forces them to signify, and the brands call it the brand promise. They were forced into this idea that even being quiet meant you picked a side. That's crazy. And some brands went all in. Nike went all in. Remember the Colin Kaepernick stuff? They just went all in on that. Done. And at first, they lost some customers and they got some hate on the internet and all that stuff. Ultimately, that year, because of that ad campaign, they made 25% more in revenue than they had the year before. So there's a reason why they do that. They're incentivized to do this. Only, at scale, that becomes a real problem.
Marc:Yeah. I tried to look into who, of the Fortune 500, were some of the first brands to have public-facing websites and who were some of the last. It's really tricky to track down because the 500 changes year to year. But it kind of plays out like you would expect, right? In 93 and 94, you had almost no Fortune 500 presence at all. Any of the websites were kind of experimental, basically a web server that some IT person like one of us would have set up underneath a desk or something like that. It wasn't mission critical. It was hosted on non-resilient infrastructure. Then you start to see, in 95, some of the early adopters break cover. One of the many reasons, not the only one, is that Netscape IPOs in 1995, which kind of legitimizes the web, and the media coverage starts to explode. And what happens is Fortune 500 companies begin launching these kinds of, I wouldn't say brand-promise sites, right? They weren't sophisticated at that level yet. They were more like: this is about us, this is the product we have, this is an investor relations page, these are some product brochures. But by 95, still well under half of the Fortune 500 has a site. And you see kind of defensive URL strategies, you know, Microsoft.com, although Microsoft had a site, whatever, but you get the point. By 96, you have majority representation, over half. But you don't get full representation until about 99. So several years, the late 90s.
Renee:That's when we're doing .com. I know. We're doing .com in 99.
Marc:I know. I just thought it was really fascinating that the shift kind of sneaks up on organizations, right? It didn't necessarily start out as something they had to have. But then it was like, oh, well, maybe this is a tool that I should use. Brands didn't just gain reach through the Internet. They had these tools and felt, hey, I have to stake my claim almost, have this kind of symbol of our existence. And then it stops being just companies putting stuff out; they start functioning, providing services, having a persona, right? There's an interaction. You interact with a brand on a different level because you're visiting their virtual site, you're using their virtual services, or you're buying something from them. And that's a different kind of role assignment. People do assign roles to those companies: villain, hypocrite, ally, sellout, negative ones but also positive ones. And those roles stick with people, even if they have very little to do with how the company actually operates. Brands in this period end up navigating narratives they don't necessarily write. They respond to expectations they never agreed to. And they carry emotional weight that used to belong to the institutions. The internet sort of bleeds out their brand identity, their brand structure, their being, and people interact with it. It's like any piece of software, right? I build a piece of software and do all this positive testing, positive testing. And then somebody says, well, it doesn't work. Why doesn't it work? Oh, well, I decided to put this in the field and it broke.
Renee:Oh, I used to do that all the time. If you made a web page, I could break it. Like, I should have been QA. I could have broke that crap in no time. And then the developers were like, why did you click there? I'm like, I don't know, dude, but I did,
Marc:Right? But see, that's the thing. All of a sudden, you lose control of your identity, of your proposition, right? People are going to use your site or interact with you in ways that you didn't expect. And it changes the way you have to manage that message.
Renee:And that's why brand risk today isn't really about messaging mistakes. It's about being structurally exposed in a system that turns visibility into vulnerability. Brands didn't choose this role. They didn't choose to be a character in a plot. They didn't choose to have a persona as the villain or your rescuer or your hypocrite or whatever it is, right? They didn't choose that role. The internet chose it for them. And this is where reputation stops being soft and starts behaving like infrastructure. When it fails, everything downstream feels it: legal, compliance, security, operations, markets, regulators, employees, everybody. And it all happens at internet speed. Reputation now moves faster than governance, faster than legal review, faster than executive decision making. And once a narrative takes hold, you're no longer managing facts. You're managing feelings and beliefs, and good luck with that.
Marc:Yeah, yeah, no kidding. You want to talk about some of the brands?
Renee:You know, the one that comes to mind lately is Target. It's still being boycotted. It's still not posting good numbers. And it was totally self-inflicted. They didn't need to get in the middle of anything. And it was funny at the time. So this is after the election, and everybody thinks, if we just appease the president, he'll leave us alone, he won't say bad things, and everything will be fine. One of the things Target decided was to no longer sell pride merchandise, no longer do any of that advertising. We are now neutral to that, is what they thought. But they had built the brand on the idea that this was a safe haven for people, for everybody. It didn't matter who you were. So the response was: you are a hypocrite, you're a liar, I can't believe I ever shopped here, and I'm never doing it again. And it is still hurting them today. On the other hand, Costco said, we want to reflect the communities we're in, we're not changing anything, you can go away now. And their sales went up by 25 percent. Again, this idea that there's no upside to that: there is upside, there is downside, right? But they're not the only brands. Cargill doesn't have much of a good reputation. DuPont still doesn't have a good reputation. There are cases where you kind of deserve what you get, because that's your persona; that's what you decided to be. Here's what I will say, though. Monsanto. I don't think anybody thinks Monsanto is warm and fuzzy, like this little doll I'd want to hug. However, if you go look at Monsanto's Twitter account, they only have 50,000 followers, and I'm pretty sure that's everybody who works there. Nobody even follows them. They have no social media presence. You won't find them interacting with anybody anyway, and they kind of skate under the radar that way. And I guess I would say, and I say this about Facebook, too, all the time: I'm not going to hate you for being truthful about who and what you are, right? They don't pretend to be anything more than people who think they should own all the seeds in the world. I mean, they're not lying about that. I guess you can do whatever you want; if you're Monsanto, you're not lying about it. There is something authentic about that. And I think maybe that's it, right? Maybe if you want to mitigate the chance of somebody else saying those things about you, then be crazy authentic. The other side of it is Patagonia. Clothing manufacturing is wasteful, it's bad for the environment, and all of that is true. So how can Patagonia, as a manufacturer, say, yeah, but all of our clothes, you can buy them on the gray market; we encourage you to patch them, not buy new? If you had a Patagonia jacket with a hole in it, you could take it to the Patagonia store. They would take it, sign all the stuff, give you a thing, and say pick it up in two weeks. They would send it off to a tailor, that tailor would fix it and send it back. None of this costs you any money. And they would hand you back your jacket, fixed, rather than you buying a new one, because their idea is: we actually care about the planet. And I think that when you live your values, then you get to write the script.
If you let the internet write your future for you, then you're constantly chasing: that's not who I am, that's not what I meant to say, that's not what I meant, that doesn't make any sense. And then you end up like, well, let's talk about Ben and Jerry's for a second. They were really upfront with their activist point of view until they were bought by a company that didn't feel that way. And now there's that tug of war between this really independent brand and this corporate entity that says you can't do that anymore. You run into this situation where a company that was always allowed to be that way, that always stood on its values, suddenly can't. Their CEO quit because he couldn't stand it anymore. And now it's just like any other ice cream company. It doesn't mean anything. It is a real thing. And I think when you are authentic, it goes your way. It kind of works for you, right? I mean, if Patagonia was found to have child labor anywhere in its supply chain, I feel like they could get away with it. They could be like, oh my God, how did that happen? We have no idea. We're going to fix it. It's never going to happen again. And we're going to give these people a million dollars to make sure they can fight against child trafficking and child labor all over the world. And we would all be like, oh, good, okay. So these things really do matter. And it wasn't until social media... because, do you remember New Coke? Do you remember New Coke?
Marc:Oh yeah, New Coke, man.
Renee:It was awful. It was terrible. But I remember what it took. This was the 80s, right? So people would buy as much New Coke as they could, and then they would call the local news, and the local news would meet them at a sewer somewhere, and they would start opening it and pouring it down the sewer, and the local news would be there filming it, like, people really hate this, I really hate this, bring back old Coke, bring back old Coke. And finally Coca-Cola just gave up and brought back classic Coke. But that's what we had to do: massive-scale boycotting. Now you just have to tweet it. Tweet it, and it freaks them out enough that they'll change. That's what's different now.
Marc:I have two that I want to talk about: Intel first and then Sony. I picked these two because the Intel one was 1994, so it's pre-internet, for all practical purposes, but there's enough communication among the, let's say, techno elite that something happens. So here's the scenario. It's called the FDIV, or floating point division, bug, and you may remember this. A Pentium chip gets released in 1994, and a floating point division bug is discovered in the processor itself. Originally, Intel downplays the issue. It's kind of like Steve Jobs holding the phone and saying, don't hold it this way, hold it that way. Intel comes out and says, well, you don't need to do that kind of math; do math this way. And obviously that doesn't work. People get up in arms: academics, engineers. They start spreading the problem and exposing it. It shows up on email lists, on Usenet, on early webpages. And there's enough public pressure, outside traditional media, that Intel actually changes course and offers full chip replacements at enormous cost to them. So it's like the first case where online technical communities force a corporate reversal. I thought that was really interesting, because the reputation damage emerged before all the tweets and the blogs and the TikToks and all of that sort of thing. I just think it was really fascinating. And then I want to talk about Sony. Do you remember the Sony breach?
Renee:I still talk about the Sony breach. I can't get enough of it. That's how we found out everybody hated Adam Sandler.
Marc:So this is the awesome part about the Sony breach, right? Not the Adam Sandler part. The breach itself, that's not the news, right? That's not interesting. It's like, oh, a breach happened. People like myself look at it and go, oh, look, there's this exploit and that thing, whatever. That's interesting, right? But the news wasn't the breach itself. It was the leak. All this information that gets leaked: contracts, people hating other people, emails.
Renee:Do you remember the big scandal that came out of that? This is how stupid they were. They had an email where the subject line was "red envelope." Now, if you know anything about Chinese culture, the red envelope always has money in it; it goes to, like, the firstborn who isn't married, or whatever, right? And so apparently they had been violating the Foreign Corrupt Practices Act the whole time. In the Chinese market, you can only get a couple of foreign movies, and U.S. movies in the Chinese market are foreign movies. So if you want them both to be Sony, you have to pay off the Chinese government, and they were doing it under the title "red envelope." The stuff that came out of that email was unbelievable. And their security, this is how they found out the security was so terrible and lax, the security audits were in the email. You could just open the spreadsheet and see everything that failed. It was like, did you guys ever pass an audit? It was unbelievable.
Marc:It was unbelievable. Crazy amounts of data. Racism, sexism, hypocrisy, huge HR issues.
Renee:That was just entertainment. You worked there. That's just what they were like. To me, I was like, yeah, that's the culture. Oh, well.
Marc:The culture of the studios is bad, right? Yeah, it was. I mean, the outcome: senior execs resigned. Whole films were delayed or canceled. All the legal exposure, the hit to employee trust and morale. The government gets involved. Cybersecurity becomes a board-level issue instantly, everywhere. Can you imagine? If you're some other business, and I won't name names, but if you're some other business and you've got a lot of dirty laundry, a Sony type of breach airing all of that dirty laundry, yeah, of course it's going to be a board-level event. So I love the Sony breach. And to me, these are: Intel, for all intents and purposes pre-internet, but not pre-internet enough that it didn't matter, right? It did matter, people got the word out, and Intel's reputation was damaged. And then Sony, their reputation still sucks to this day. There's still stuff people are trolling through, and it comes out to this day. And what year was that? I don't even remember.
Renee:It was 2013 maybe?
Marc:Yeah, 2013. Because I remember I was at Forrester.
Renee:I was at Forrester and I remember coming in that day and they said Sony got breached. And I was like, it couldn't happen to a better bunch of jerks. And part of it was, you know, that IT department had so much money that they were running fiber to the desktops, to the desktops, like for accountants,
Marc:To the desktop, right?
Renee:And then you come to learn that their CIO, who was a younger guy, was going to Computerworld and Wired and all these other places and having these conversations saying, why would I spend a million dollars to solve a hundred-thousand-dollar problem? And we're like, oh, so you finally got breached and realized it's really not a $100,000 problem, is it, jerk? And there are two hilarious things about that breach. One is that they had a BlackBerry server and it was just turned off. They didn't unrack it. They just turned it off. So when all of those drives got rewritten and they had no email, they had nothing, their answer was: we've got a box full of BlackBerrys and that server, we're turning them all back on, we're giving everybody BlackBerrys, and then we'll have email for a minute. And the other really weird thing that happened was that the very first lawsuit to come out of it was not the usual thing. Usually it's about insurance companies and whether they'll pay out your cyber insurance and all that stuff. No, it was the employees. The employees sued them first for breach of their privacy. And so they ended up paying for five years of credit monitoring for all of their employees just to get them not to sue. And this is one of the very first big breaches where you had to deal with all this crap. This is how we realized, uh-oh, you really do have to manage this stuff, and we really should think about what a breach means. Because it turns out, all these years later, what is it, 10 years later, I would tell you today the breach is not what's going to cause you a headache. It's your response to it that's going to beat you up. So listen, it's going to happen to everybody. I said this to Google once: when you get breached. And they said, we're not going to get breached. And I said, when you get breached, because it's not if, it's when. How you respond is what's going to determine whether or not you come out of it okay. Have you ever gotten one of those briefings from the FBI where they come to your place? Oh yeah, two guys with guns. And the first thing they tell you is: there are two kinds of companies in this world, those that have been breached and know it, and those that have been breached and don't know it. That's it. That's it, right? Yeah. So, yeah. Anyway, I love the Sony breach. I still talk about it. People don't care anymore, but I feel like it's a good cautionary tale.
Marc:It is. It's like, okay: visibility equals vulnerability. Reputation is part of the infrastructure. The Internet is an amplifier, not just a mirror. And you've got all the stuff it foreshadows: employee activism, leaks as leverage, culture as a risk surface. So treating reputation as a communications issue is so dangerous. You can't message your way out of systemic risk exposure. You can't patch a narrative like you can software.
Renee:And brands are now risk surfaces. They absorb pressure from a system that rewards outrage, speed, and symbolic conflict. And most organizations are still operating as if reputation is something you review quarterly instead of something you're exposed to constantly.
Marc:I just can't believe, 10 or 12 years on from the Sony breach, however many years past the Target breach, however many years past whatever, that people still think it's, as I said, a communications problem. They don't treat it like systemic risk. I don't know. It's just hard to fathom.
Renee:Yeah, not to bag on CISOs, Chief Information Security Officers, but I think if they were better communicators and found better ways to abstract what they do into the business, the business might have a different view of it, right? I always feel like they're not really good at telling that story in a way that's accessible to normal people. And they never talk about security as an impact to the brand. They talk about security as an impact to systems and end users. And I don't think that's the same story.
Marc:Yeah, no, it's totally not the same story. And just think about if the CISO of Sony had started from a different approach and said, look at all of this frankly incriminating, damning evidence that sits inside of all of our information stores, look at this, and then basically leaked that to the board and said, do you really want this out in public? They should have had, like, a shaming exercise. That's where they could have succeeded, right? Maybe it wouldn't have changed the culture, but maybe they would have tightened up about it. I don't know.
Renee:Yeah, the kind of stuff they were putting in email was, like, absolutely ridiculous.
Marc:Yeah, but maybe every CISO needs to be, like, the internal, like, you know, tattletale or something. You know, the guy that knows where all the bodies are, right? The guy that knows where all the skeletons are.
Renee:Or at least help the auditors know where it all is. Like, that's what I could do. Like, hey, audit, come here. You got to see this. This is crazy. And then you didn't do what they did. And that's what they're there for.
Marc:I mean, CISOs would be so much more successful if they demonstrated to boards: this is the reputational risk that you're pushing off here. And with Sony, it impacts them to this day. Some of the contracts that people do with Sony now are super protective, very territorial, and they just wouldn't have had that if they had been protected. The uncomfortable realization here, right? So the internet isn't failing because it's evil. It's failing because it assumed better outcomes than it was designed for. Think about all the assumptions that we, not we as in you and I, Renee, but the people in the earlier days, the academics, made: that there were going to be better outcomes because the systems worked a certain way, or were designed a certain way. It assumed good faith without protecting against bad behavior. That was the operating assumption going into the beginning of the internet era. It assumed shared reality without maintaining it. We built systems optimized for connection and scale, but not for restraint or memory or, I love this, accountability, right? And you know what I tell my teams all the time, because of this whole agility thing, right, we're agile, we're autonomous? Teams can't be autonomous if they're not accountable. Yep. Yep. Autonomy without accountability is not an operating model.
Renee:Mayhem. That's mayhem. It's called mayhem.
Marc:So when these things disappear, the system just starts rewarding the wrong behaviors very efficiently. We treated human behavior as a constant and technology as the variable, when it should have been the other way around. We assumed people would rise to the tools instead of asking what the tools would pull out of people once they were optimized at scale. We talk about this all the freaking time: human behavior doesn't change. Basic needs and wants don't change. Humans are the same generation to generation to generation. What changes is the technology. It's not the other way around. And you know what, we're sitting here with rose-tinted glasses, because we're like, oh, wouldn't it be better if the platforms didn't do this, wouldn't it be better if they didn't do that. You know what, Renee, we're just as naive here, because humans are... We're just, yeah, you're right. We're nostalgic. But that stuff is not going to change. Human behavior is not going to change.
Renee:We're not. And the brands ended up standing at the center of that system, absorbing pressures they were never built to carry. Not because they did something wrong usually and not because they were reckless or irresponsible occasionally, but because they were visible, because they were legible, because they existed at the intersection of daily life, emotion, and public discourse in a world where institutions felt distant and unreachable. Brands became shock absorbers for frustration that had nowhere to go, economic anxiety, political polarization, cultural conflict. All of it gets routed through the most visible surfaces available. And for a lot of people, that surface is a logo.
Marc:Yeah, it is a brutal way to learn that visibility is vulnerability, right? That reach is exposure. I love that. That being known at scale means being interpreted at scale. And once that happens, neutrality disappears. You don't get to just operate anymore. You get to signify, right? And it sounds lofty, but it's dangerous. And you get judged not on what you intended but on what the system turns you into.
Renee:Isn't that awful? Who wants to do marketing? And you know what the other thing is? Think about how dangerous that is, how out of hand that gets, and how fast. And yet, when you look at who's in charge of social media at these big brands, it's like a 24-year-old kid, because they know how the platforms work. Not because they know anything about the behavior, the messaging, what it all might mean. Nope. They can just fire stuff off to Twitter faster than anybody else.
Marc:Like that's good. I've sat in some of these meetings: well, let's tweak the algorithm to do this, let's tweak the algorithm to do that. And the outcome is not, do the users get a better experience? The outcome is, does that drive more revenue? Does that drive more sales? Does that drive more... you know, the dirty word, engagement. And it's like, oh, when did customers lose out?
Renee:You know, I used to tell marketers, because when you work at Forrester, you occasionally find yourself sitting on the other side of the business. I was in technology, but you had a whole bunch of marketing analysts who did nothing but try to figure out ways to optimize platforms. When you talk about the personalization of advertising, dude, that was Forrester. The digital transformation, Forrester did all that, right? And you would sit over there and think about that, and I would tell them all the time: every time you guys do this, it's I need more followers, I need more. And then all of a sudden you have 10 million followers, and you're like, oh my God, that's Kardashian numbers. And yeah, you know what? There's a risk manager somewhere sitting in the bathroom getting drunk on that bottle of vodka they keep under the sink, because you just created a firestorm of crap that is going to bury us one day.
Marc:Yeah, yeah.
Renee:And you don't even know it.
Marc:We talked about this in the advertising episode, right? About influencers and influencer behavior and what a risk that is. The TikTok bombs, or this or that, all of a sudden something blows up and there's some relationship issue or somebody is divorcing somebody else. And that stuff matters to those followers, and it's like, do you really want your brand associated with that? Look at that risk. At that scale, the risk number just keeps going up. The risk register, I can just see it climbing into the top right.
Renee:And I would love it if they actually updated that risk register, but they don't. They just assume whatever assessment they did when they had 10,000 followers still holds at 10 million. I don't understand why they think that's true. You know, I'm nostalgic for the past, always, always, always. But sometimes I miss the slowness, the waiting, the way being online felt optional, like a place I went to instead of a condition I lived in. You logged in with intention. You logged off when you were done, because if you didn't, no one could use the phone. It didn't follow you around in your pocket. It didn't ask for your attention every three minutes. It didn't give you anxiety; you just weren't there enough, right? It gave you room to think before you spoke. There was a boundary there, a sense that the internet was part of your life, not the infrastructure of it. Now I do everything on my phone. I grocery shop on my phone. It's terrible. That boundary mattered more than we realized, because it gave people space to be unfinished, to change their mind, to not have every thought immediately witnessed and judged.
Marc:I miss when saying something stupid required effort. You had to sit down and type it out. How many times have you done this in email, right? I don't even use email anymore, literally; it's so rare that I email now. But you had to sit down, type it out, connect, think about it, wait, pause, spell check it, whatever. There were enough steps in between that most of the bad ideas died quietly before anyone else ever saw them. I always look to see how many emails I have in my drafts folder. Those are all the bad ideas I just never sent.
Renee:Like don't type angry.
Marc:Yeah, don't type angry. But that distance between thought and publication is just kind of gone now, and that's not a small change. Again, I've sat in some of these meetings with various platform players, talking about user experience and what will get the most engagement. They talk about time to post: on Instagram, from the time I click the button to the time the photo actually posts, how can I shorten that? How can I make it easier, with less friction? But that friction is sometimes kind of important. That's the shift in how humans interact with each other now.
Renee:We didn't lose the Internet. It didn't disappear. And it didn't get taken away from us.
Marc:Yeah, we just lost the illusion that connection alone was enough, that proximity would solve disagreement, that visibility would create understanding, that access would automatically produce empathy.
Renee:What we learned instead was connections need context. It needs friction. It needs governance.
Marc:Without those things, connection doesn't civilize us. It overwhelms us. And nostalgia isn't about wanting to go back. It's about recognizing what we stopped protecting along the way.
Renee:So the internet wasn't a failed utopia. It was an incomplete one. We built connection first and figured out the rest later. And that means there's still room to finish the job more thoughtfully than we started it.
Marc:Gosh, I hope so. I certainly hope so. We're not just living inside the consequences of that experiment. We're learning from them, I hope. The internet is still being written, maybe, whether we admit it or not.
Renee:We're not going to be pessimistic. The fact that we even have these conversations, that we reflect on what worked and what didn't, is part of what made the early Internet a special place in the first place.
Marc:So maybe the next chapter isn't about going back. It's about designing forward with a little more humility and a lot more intention.
Renee:Dude, regulate the Internet. I'm sorry, I just had to say it. I'm Renee Murphy; I never met a reg I didn't like. Hey, you guys, and I mean that. There are actually, right now, as we speak, probably a couple of different bills in the hopper about how to regulate social media. And gladly, you see it happening in other countries. So maybe it'll take on a kind of life of its own: you can't use algorithms to manipulate people, and if your algorithm does manipulate people, you should be held accountable for that. All of those things are really important, and we shouldn't leave them dead on the ground, right? All right, you guys, thanks for listening to Nostalgic Nerds.
Marc:And we'll see you next time, assuming the modem connects us up.