Telemetry Now  |  Season 2 - Episode 49  |  June 27, 2025

Telemetry News Now

AI Data Center Power Demands, Gartner’s AI Agent Prediction, AI Copyright Woes, 5G Infrastructure

In this Telemetry News Now episode, Phil and Justin discuss the future energy demands of AI infrastructure, including eye-opening predictions about GPU power consumption. They discuss Gartner's skepticism regarding agentic AI projects, cautioning against inflated expectations. The duo also covers Cisco's latest quantum networking moves, legal battles over AI data rights, innovative private 5G networks at UK ports, and new high-capacity networking investments in Scandinavia. Plus, updates from Panama's internet shutdown and the latest industry events.

Transcript

Telemetry News Now.

Welcome to another episode of Telemetry News Now. We are recording right here at the tail end of June, and, well, I I just got back from vacation. I took my family, Justin, to, to Disney, Universal, the Space Center, all that in the Orlando area.

It was a lot a lot of fun, but I am telling you, it is hard to get back into work mode.

And I know you were on PTO for a couple days as well. Did you go someplace cool?

Yeah. My wife and I, our daughter was with her parents, so we took the time, went up to Chicago for a couple days, one of our favorite cities, and did a little sightseeing and had some nice meals. So, yeah, it was a good little break. And, Phil, I don't know how the weather was in Orlando, but, the US is currently under this heat dome, I think they're calling it. Right? So Chicago was nice. It's a little cooler than it is here in Saint Louis, so it's nice.

It was hot. But I think Orlando is always very hot.

But it was ninety… This time of year.

And sunny every day. We did get the heavens opened, and the torrential rain started when we were in, Hollywood Studios at Galaxy's Edge, which was just so fun. We had a ton of fun there. We love Star Wars.

So but we did have to go find shelter. But it was a great time, and I'm glad to be back at it. And we do definitely have some news to cover, but my goodness, so much AI stuff going on. But we do have some news for you that is not necessarily AI related.

Let's get started.

From Network World last week, a new KAIST roadmap. If you're not familiar with KAIST, that's the, Terabyte Interconnection and Package Laboratory, TeraLab, at the, Korea Advanced Institute of Science and Technology, hence KAIST. Anyway, their report warns that future AI chips, which would be powered by high bandwidth memory, HBM eight, that's what they're specifically citing, could consume over fifteen kilowatts per module by two thousand thirty five, so in ten years. And that far exceeds today's infrastructure capabilities, something that, Justin, you and I have talked about a lot on this show and on the main podcast, power, cooling, infrastructure, real estate, all of that stuff.

And it's and it's pushing grid systems to their limits. So the report does go on. The dramatic rise in energy demands, you know, driven by the scaling needs of AI workloads, that's what we're talking about here. And, you know, we're we're talking primarily about LLMs for the most part.

That's what they mentioned in the report.

But there are other huge models in production being developed that focus on computer vision, advanced predictive analytics. But anyway, it's it's reshaping data center architecture, chip design, global infrastructure planning. We've been talking about this. There's a geopolitical element. And this is when we start talking about things like liquid cooling, three d stacking, real estate, and and zoning and location based deployment strategies. All that's becoming absolutely essential like it's never been before, since basically we're seeing electricity cost, and I guess also electricity availability, emerge as, like, the bottlenecks for AI growth. And I don't know if anybody really anticipated that, but that's where we are right now and, interesting report to read through.

Yeah. And I guess if I'm understanding this correctly, we're talking about fifteen thousand watts, fifteen kilowatts per GPU.

Yeah. Per module.

So if, like, I were to build a machine, that had eight GPUs in a single server, Right? You're talking about eight times that just for a single server. That's just crazy to think about.

Well, we're, you know, we're talking about rack after rack of that, so that's why we are, you know, looking at power as as a major bottleneck. I mean, hundreds of kilowatts per rack is is insanity. I mean but that's kind of the nature of the progress of AI right now. It's brute force. It's adding more horsepower, more memory, more compute, more storage, more GPUs, all requiring more power and and square footage and all that stuff. And that's been the method thus far. And and, I mean, once in a while, we get these little glimpses of folks that are doing things to make models more efficient or training, pretraining specifically, more efficient.
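To put the numbers being discussed here in context, a rough back-of-the-envelope sketch. The fifteen kilowatts per module is the figure from the KAIST report; the server and rack sizes are illustrative assumptions, not numbers from the episode.

```python
# Back-of-the-envelope power math using the projected HBM8-era
# figure of 15 kW per module from the KAIST/TeraLab report.
# Server and rack densities below are illustrative assumptions.
KW_PER_MODULE = 15        # projected draw per GPU/HBM module
MODULES_PER_SERVER = 8    # the 8-GPU server from the discussion
SERVERS_PER_RACK = 4      # assumption; real densities vary widely

server_kw = KW_PER_MODULE * MODULES_PER_SERVER
rack_kw = server_kw * SERVERS_PER_RACK

print(f"Per server: {server_kw} kW")                          # 120 kW
print(f"Per rack:   {rack_kw} kW ({rack_kw / 1000:.2f} MW)")  # 480 kW
```

Even with a conservative four servers per rack, the projection lands each rack at roughly half a megawatt, which is why power shows up as the bottleneck in the conversation.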

I mean, I think we have to. Right? We can't keep doing this, like, scaling to where we're exponentially putting demands on the power grid and then having to come up with new engineering solutions to cool these chips and keep them from overheating. I mean, at some point, we've gotta start figuring out a way to make things more efficient, the models more efficient, or require fewer GPUs, things like that, I I think.

But Well, the thing is that we're talking about foundation models and and what many call general purpose models. And so they're very broad, very, very broad in scope. If you can imagine, like, using ChatGPT or Claude, you know, it's trained on all data of the Internet, publicly available data. Right?

So and then, of course, you want inference to be very, very fast. So, you you you know, it needs those resources. But that's not necessarily what we need for all, you know, AI moving forward. There is all those use cases where you can have specially trained models that are much, much smaller in scope as far as the number of parameters and then all of that stuff.

So that way, you know, they can operate in a much more efficient manner yet still produce the results that we need. So, you know, we hear about that from time to time, and I think we're gonna hear about it more and more, especially as this bottleneck becomes like a complete brick wall, you know, as far as power and and, data center infrastructure and that sort of thing.

Yeah. For sure.

Okay. So from SDX Central yesterday, and I think you're gonna like this one. I like this one at least. According to Gartner, beloved Gartner, the future isn't actually agentic.

Alright. So what does that mean? Basically, they predict, this is Gartner, predicts that over forty percent of agentic AI projects will be abandoned by twenty twenty seven, it's right around the corner, because of rising costs, unclear ROI, and overhyped expectations with most of those current initiatives being early stage experiments. Some people call them POCs.

Now this is a soapbox that I have been on a lot lately. Mhmm. And, no, I I don't I don't think we're necessarily headed to the next AI winter. And we've had those in past decades where things were overhyped, unmet expectations, funding dried up from industry and academia.

But I do admit, I tend to kind of agree with Gartner here. Not not because I think that, agents are hype in and of themselves, like, from a technical standpoint. Like, oh, they don't really do anything if it's vaporware. No.

No. No. I I do believe that there's real value here, and and that's awesome. But I don't think most organizations really understand what it takes to actually implement a real production AI solution.

I think that's gonna be the wake up call where folks are like, alright. Maybe we shouldn't be approaching this. Right? Anyway, so despite the skepticism, Gartner does still expect agentic AI to grow, forecasting that by twenty twenty eight, fifteen percent of daily work decisions and thirty three percent of enterprise software applications will incorporate it in some way.

But I like this. Ready? They warn against agent washing. We've had AI washing. We've had intent washing.

Remember intent based networking? And we said everything was intent washing. Mhmm. So it's basically a rebranding of basic tools as agentic, and then they advise organizations to prioritize strategic value driven use of AI agents.

I mean, isn't this true for all software? But in any case, we've seen that time and time again in this industry. The rebranding, I remember everything was SD something when, you know, SDN came out, software defined networking, and and everybody rebranded. And under the hood, the tech didn't really change that much.

You know? Mhmm. I wish we had a lot more time, Justin, to talk about this in particular. Like, we can make a podcast out of this because I talk to a lot of technical leaders about AI initiatives.

Right? They wanna look into some projects, some something.

And a lot of the time, they don't know what. So we start breaking down ROI, and we they realize, look, we have no idea how to measure it. They don't know what their goals are. You know, they don't understand that well, many people, some do, that there's a difference between technical KPIs, how good your model is, and then, like, business KPIs. Like, oh, it's great that it's efficient and you have low latency and it was accurate, but who cares? It didn't give me anything valuable to the business.

And yeah. Also, there's a huge cost of very little benefit.

Oh, I love this one. So sometimes there's, like, a real goal, and we kinda go through that. Right? We go through that motion.

And then we start talking about data pipelines. Yeah. And you get this wide eyed look like, oh, we have no data in one place. That's a big mess, and I have no staff to do that.

And I'm like, well, your AI initiative is pretty much, you know, stopped then. So I think I don't know where Gartner comes up with those specific prediction percentages, by the way. I have no idea where thirty three percent and fifteen percent come from. I don't know how they but I do agree with this mix that they have of both skepticism and also, like, this prediction that we'll still see growth in in this agent ecosystem.

Yeah. I mean, I think whether forty percent is the number that will be canned or not, who knows? But I think we've talked about this on the podcast a number of times that a lot of these organizations, like you said, the leadership is getting into, well, we're gonna be behind if we don't start figuring out what our agentic AI strategy is. And it's like, well, if you don't start with what the business problem is you're trying to solve and what the ROI of the investment in AI, or in any other tech, like you said, for that matter, is, it's fraught with failure.

Right? Because it's like, otherwise you're just experimenting, you know, because you're in a panic that your competition's gonna leave you behind. But I think the other sixty percent that aren't gonna get canned is where people started from the other end. Right?

They they have very tangible use cases. They have things that are being done very manually today or very labor intensive today that are very repetitive tasks that AI as a technology lends itself very well to help people solve. And so applying agentic AI to some of those type of things where the goal is very clear, the ROI is very clear, it's just a matter of proving that the technology can actually solve that problem, those are more likely to be successful. So, yeah, it'll be interesting to see how this prediction plays out, but it is interesting to see even Gartner themselves is, I guess, signaling a little bit of caution.

Right? Like, just because it has AI on it doesn't mean it's gonna be successful and that it should be invested in.

Yeah. Yeah. I agree with Gartner on this one just because the this is my life. You know, these are the conversations that I have with folks, and they're like, oh, we want to do this, this, and this.

And then we start to discuss the the technical nature of it and the and and mapping that to business outcomes and stuff, and it sort of fizzles. You know? And they're like, well, can we start a POC? And I'm like, a POC to do what?

Like, what do you want to do? And and if you do know what you wanna do, it's like, well, do do you have, like, five data engineers that I can work with to get, you know, these pipelines going? And, you know, we'll we'll know. And I'm like, well so I think the reality is that it's just a lot harder to do in practice.

You know, it goes back to the build versus buy argument as well, because Kentik is doing it, but we're gonna see other companies as well Oh, yeah. Provide that, whether it's the end result, like, we're doing the actual application of a model for you, awesome, or just the data part of it, which Kentik also does. But, nevertheless, you're gonna see that, like, you know, data engineering, data pipeline as a service, and then it's all prepped and, you know, the pipeline's there, and you can feed it into your models and and service that way.

So the you know, there there's actually quite a bit of new opportunity here as well. We're going along with this particular headline, but I think it's important. So I do think there's a lot of opportunity here. As much as as much as there's, like, the the agent washing and a little bit of hype and some people are predicting the next AI winter, I do see another world of opportunity for for businesses, for for folks to kind of develop in their careers as well.

So Yeah.

And I also wonder whether, a lot of the advancements around MCP and a to a protocols and so forth will help with some of that, where you don't have to build everything yourself. Right? You could tie in through an agent that a product like Kentik has with another product that you may have. Sort of like the days of connecting two products together via APIs, but in this case, it's a lot more intelligent because each side is doing the, you know, the their own AI of their own data where they know it a little better and then being able to stitch those together using something like an agent to agent.

And I don't know. Phil, I think you've done a lot more, reading and research and playing on MCP than I have, but, you know, potentially, if you don't have all of your data in one place, like the problem statement that you mentioned earlier, you have it in various different places. MCP may be able to help with that, right, where you could train on a data, lake that lives somewhere else or if it's exposed through an MCP Mhmm. Server.

Is that the way I should be thinking about that?

I mean, that's exposing the tools and the agents and other things like that, making them available for sure. So there is some value there, but the data itself needs to be cleaned and prepared. And and and there just needs to be a lot of thought through that, especially for what we do in networking. Like, not just cleaned and prepared for historical analysis, but for real time analysis. You know?

So doing all of that, MCP certainly plays a role in the sense of, like, making tools available and and accessible and things like that and hooking into other systems. You know, we're familiar with APIs, so this is not, like, groundbreaking, you know, technology in that sense. But there's still there's still a ton of work to do on the back end, and it's good work. And it's probably seventy five, eighty percent of the work that needs to be done before any kind of a AI initiative gets off the ground, at least if you wanna be successful.
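The tool-exposure pattern being described can be sketched generically. To be clear, this is not the real MCP SDK; the class and tool names below are purely hypothetical, just to illustrate the idea of a server advertising named tools that an agent can discover and invoke.

```python
# Hypothetical minimal sketch of an MCP-style pattern: a server
# registers named tools, and an agent discovers and calls them.
# NOT the real MCP SDK; all names here are illustrative only.
from typing import Callable, Dict


class ToolServer:
    def __init__(self) -> None:
        self._tools: Dict[str, Callable] = {}

    def tool(self, name: str):
        """Decorator that registers a function as an invokable tool."""
        def register(fn: Callable) -> Callable:
            self._tools[name] = fn
            return fn
        return register

    def list_tools(self):
        # An agent would call this to discover what the server offers.
        return sorted(self._tools)

    def call(self, name: str, **kwargs):
        # Dispatch an invocation to the registered tool by name.
        return self._tools[name](**kwargs)


server = ToolServer()


@server.tool("top_talkers")
def top_talkers(site: str, limit: int = 5):
    # Placeholder: a real tool would query a telemetry backend.
    return [f"{site}-host-{i}" for i in range(limit)]


print(server.list_tools())
print(server.call("top_talkers", site="ord", limit=3))
```

The point of the pattern, as in the conversation above, is that the caller never touches the backend directly; it only sees the advertised tool surface, while the data cleaning and preparation still has to happen behind it.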

And you don't wanna be stuck in the never ending POC, which, you know, I've I've been there. We do need to move on, though. So from Network World, on June twenty fourth, Cisco has joined a ten million dollar funding round for Qunnect, spelled, interestingly, q u n n e c t, a quantum networking startup developing room temperature.

That's the key. Telecom compatible quantum hardware designed to work over existing fiber infrastructure. We talked about this kind of stuff recently, Justin, on a different episode, but same same technology.

So Qunnect's products include entangled photon sources and repeaters. They're built for real world deployment rather than, like, lab only environments, I guess. And they're already being tested in cities like New York City and Berlin. And, Cisco in the article says, sees Qunnect as a key partner in its broader quantum networking strategy. And that was telling to me that that there is a strategy here. And we already knew that, of course, because this is now multiple articles that we're seeing about Cisco working in this space, quantum networking.

And and then specifically in this case, aligning with its own work, Cisco's internal work on room temperature quantum entanglement chips, you know, which I would assume power future quantum data centers among other things, but also long haul. So, you know, we talked about this, Justin. I think it was last month. So clearly, we're seeing more movement in this direction. Really, really interesting stuff. And, it's just interesting, also, not just interesting because it's cool, but this shift in the fundamental nature of how we do networking.

Well, that's what I was about to say. Like, the the technology itself is fascinating just, you know, from an intellectual perspective. Right. Yeah.

The the applications that this could unlock are gonna be really interesting to watch. Right? If the theory of quantum networking works out where you have near zero latency over any amount of distance, it fundamentally changes the way we've done networks for, well, I mean, basically since networks were invented. Right? Because a lot of the ways that we have built and designed our networks took latency into account in how you build them. Right?

If you think of, like, high frequency trading where latency is absolutely critical to how quickly you can make a trade, a lot of time and energy and engineering was put into making sure that you're as close to the exchange as possible, and fibers were built in a direct line instead of taking the longer path, even though it was more difficult and more expensive to do, just because that latency was so critical. So if quantum networking actually can do what the promises are, that you can have near zero latency, let's say, from New York to LA. Right? Like, that could fundamentally shift the way we could build a network.

We could even have potentially an AI fabric where we're doing training of models and one GPU is in New York and one's in LA. And that's perfectly fine over that long distance because latency is not a factor.
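For reference, the classical fiber latency floor being contrasted here can be computed directly. The distance and refractive index below are rough assumptions for illustration; real fiber routes are meaningfully longer than the great-circle path.

```python
# Approximate one-way propagation delay over fiber, New York to LA.
# Distance and refractive index are rough illustrative assumptions.
C_KM_PER_S = 299_792.458   # speed of light in vacuum, km/s
N_FIBER = 1.468            # typical refractive index of silica fiber
DISTANCE_KM = 3_940        # rough NY-LA great-circle distance; real
                           # fiber paths run longer

v = C_KM_PER_S / N_FIBER            # ~204,000 km/s in fiber
one_way_ms = DISTANCE_KM / v * 1000

print(f"One-way propagation: {one_way_ms:.1f} ms")
print(f"Round trip:          {2 * one_way_ms:.1f} ms")
```

That roughly twenty-millisecond one-way floor, before any queuing or processing, is exactly the constraint that drove the straight-line fiber builds for high frequency trading mentioned above.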

I was actually thinking about that. Yeah. That's true.

We're long I think we're still a long ways from that, but those type of applications, it would be amazing to think about.

Yep. I agree. I agree. Really interesting stuff. I can't get into it much deeper than that because it really does go beyond, you know, my purview of knowledge, but, really yeah. So alright. Let's, let's get to the next one from an article on the Data Center Dynamics news page.

Panama, that's the nation, the country of Panama, has declared a state of emergency in Bocas del Toro following mass protests by banana workers over pension cuts. Also, some layoffs, I think low wages as well, looking at the article there. And there was somewhat violent unrest and even damage to infrastructure.

And so in response, the government of Panama imposed rolling Internet blackouts. They restricted constitutional rights.

They were going around arresting folks without warrants, and, of course, this drew criticism about, you know, limiting freedom and expression and economic activity. Right? Because that's you know, the Internet does represent that. You know, this isn't new in the region, and, frankly, the Internet connectivity in that region does tend to be poor quality anyway, more or less in the in the rural areas, of course. But it's still not good news for protecting access to information. And, frankly, you know, like I was saying, the modern economy, the Internet is sort of a proxy for that, for the folks in Panama.

Yeah. And I haven't seen any reporting from Doug Madory on this. This is the kind of thing that he covers quite frequently, and he is, in fact, part of an organization, I think, called Keep It On, trying to put pressure on governments and so forth to keep the Internet on when things like this are going on, because it is so critical, right, to the infrastructure and to the society in these places. And I know Doug had done some reporting recently about the internet in Iran as well.

-Mhmm. -Right? Where it was shut down for, I think, like, seven or eight days during the recent, you know, conflict between Israel and Iran, and some of the bombing that was going on. The country of Iran had actually cut off the Internet to its citizens to keep them, presumably, from being able to get news, to find out what's going on, and to potentially even prevent Israel from doing cyber attacks inside of their country.

If you turn the Internet off, they can't use it to do cyber attacks, presumably. So there's a lot of reasons that countries do this, but obviously the Western nations are trying to convince them to to keep it on during these types of things. So moving on to the next article from NPR. The title is, In a first of its kind decision, an AI company wins a copyright infringement lawsuit brought by authors. So, you know, there's been a lot of discussion about data sovereignty and who owns the data.

If I'm a published author and I put my works out on the Internet, does that mean that the various different models that are training on Internet data can train on that? Is that, open source? Is that copyrighted? I mean, it is copyrighted because someone took the time to actually publish that.

So there's been a lot of press around this, and now we have one of the first rulings from a federal judge in San Francisco in a copyright infringement case, basically saying that if the authors are putting their material out there and it's copyrighted, the AI company, I guess it was Anthropic in this case that the lawsuit was against, can actually train on that data as long as it was legally acquired. So this is gonna be interesting to keep an eye on and to watch how these types of lawsuits unfold. Presumably, there's still going to be some appeal to this, and we haven't heard the end of it; as legal proceedings tend to work, this will continue through appeals.

So this is probably not a final decision, but interesting decision by a federal court in San Francisco anyway.

Yeah. Yeah. I mean, so now AI companies, like the I think it's I don't remember the exact wording, have the right. As long as they obtain these copyrighted works legally and ethically, they can train on that. And that's that's a big deal. And, this is not just Anthropic. We're seeing news, coming with regard to Meta, to Google.

Microsoft. I saw an article, all related to this. So I don't know. Is it okay that AI companies can legally use copyrighted works as long as they, you know, acquire those works legally, to train their own models? You know, again, if they obtain those materials lawfully and, you know, the judge calls these things, like, fair use, transformative. But, you know, the thing for me is that everything that a model is trained on is now kinda, like, memorized by the model, if that makes if that makes sense.

And, like, what if you don't want your data like, you know how you can, like, scrub your data from the Internet if you wanted to? You can you can do those things and and delete things. You can take your your blog post down. So, Justin, on on your blog, I said what is it? Ryburn dot com or something?

Ryburn dot org. Yeah.

You could delete a blog post. Well, you know, if a model was already trained on that, it's kinda memorized by the model already. So it just I get what the judge is saying, and I understand totally understand, but there is a little bit more depth and complexity and nuance here. Another thing is is it's not mentioned here, and we are talking about, like, the future of AI copyright law.

Right? And I have to assume there's gonna be some appeals. Right? This is not this is not the end.

But Oh, yeah. For sure.

What's interesting to me is that a lot of content copyrighted or not copyrighted is actually generated by AI now. And so I think we're gonna have this, like, downward spiral where future iterations of of models are trained on, you know, new stuff on the Internet, which was generated by AI.

So that's what the the new training data is, and it's gonna become more and more as far as a proportion as new models come out year after year.

So what does that mean? You know, is that copyrighted where they're not allowed to use it, but they're sorta like they generated it themselves technically? You know, if and if Claude was the the model that created the content and it's training itself. There's a lot there's a lot of complexity and nuance here beyond just, like, copyright law. I really I really believe that.

Yeah. I think that the deleting of the data is gonna be a really, important one. Right? If you look at things like GDPR in Europe where, if you don't want to be it's it's more for like contact and outreach for marketing purposes, but if you don't want your information being used for marketing purposes, you can request that it's deleted and, you know, the European Union has been very clear that your your data and your information is yours, and that you should have the right to have it deleted and the vendor has to Mhmm.

Quote, unquote prove that they've deleted it right. So they've put some meat when it comes to personal information. They put some meat behind some of those laws, and I I got to imagine they're gonna do the same, with some of this copyright stuff. Right?

Where, yes, it's okay if you legally obtained the materials, like the book is the example they're using in this article where they were saying that the the book was basically acquired by Anthropic. They actually paid for it, and so then they train their LLM on the content of some of the books and so forth. So, you know, it was legally obtained, they can train on it, but presumably if the author and the publisher comes back to them and says, hey. You need to delete this from your training.

They have to be able to prove that they did, in fact, go through and and delete it out of the model. Which I don't know how that works. Right?

Once it's already been trained on it, how do you… Well, how do you reverse that?

You're that's, like, near impossible. So I don't think that this is realistic, and that's why I think this is really weird the way it's gonna go when we're talking about models that have been trained over months and months and have hundreds of billions and even trillions of parameters and all of these things that are dependent on the the data that they're looking at, you know, whether it's, you know, high quality or not. So I think that's gonna be that's what's gonna drive a lot of the conversation is that, well, we really can't undo that. You know?

So and and you know what? For a lot of folks, this stuff is a black box anyway.

Like, when the training is done, it's like, well, how does it actually generate that thing? Like, this is the data. How does that you know, you could start talking about cross entropy if you want, but it's it's really convoluted and complex. And I don't know what the answer is, but it is interesting that we're starting to see already this involvement from, in this case, the United States, but, of course, we're gonna see it from other nations as well.

The involvement of the government and and, you know, our legal system to sort of iron some of this stuff out. It's gonna affect everybody. It's gonna affect everybody. Education, legal, you know, medical, any industry, any field, and then for on a personal level as well.

For sure.

Alright. Moving along to the next article from SDX Central: Verizon Business, Nokia win Thames Freeport private 5G network deal in the United Kingdom. So on the river, I guess it's Thames, I pronounced that wrong, the River Thames in the United Kingdom, there's a port called the Thames Freeport, which is one of the UK's busiest shipping and logistics centers.

And, Verizon Verizon Business, that is, has, teamed up with Nokia. They have won the contract to deploy 5G networks at the port. Actually, multiple of them. My read of this, the article's pretty sharp, but my read of this is they're actually using some of the 5G technology that allows you to build private networks on the 5G infrastructure.

So it's not necessarily 5G connected to the Internet in the way that you or I would connect our cell phones Philip to to a 5G network, but actually private VPN type of networks that are running over 5G. So again, Verizon business and and Nokia teaming up have won the contract to provide that type of connectivity to what's, the busiest port here, at least one of the busiest ports in the United Kingdom. So Yeah. Presumably, it's like the ships come in and out, they'll be able to connect to this 5G private network.

So Yeah.

But it's gonna be a template, I think, because this is the first, like, large scale version of this that I think will be, like I said, like a like a template for others then, like a reference, you know, for other industrial 5G at, you know, within the greater continent of Europe and and the world. But there's also this idea of the the you know how we have CDNs where you use, the CDN network to get data out to all the individual people's homes and of course there's, you know, pops and all that stuff. It's kind of reverse of that because now we're collecting data from all these individual points.

And and you can imagine the data coming in from ships and all of the stuff at these huge industrial centers. In this case, we're talking about ports, going into, and then being not bottlenecked, but being funneled into systems to do AI analytics, AI driven analytics. We're gonna see all that. I mean, think about the autonomous vehicles, the safety monitoring, various types of orchestration mechanisms across these huge ports, whatever.

I don't know what to call them, but you know what I mean.

Mhmm. So it's also a move for Nokia. They're sort of strengthening their whole 5G edge stack thing, you know, and being a leader in that. And so that's interesting to me.

And we know Nokia for a lot of their, you know, their their technology in the radio world. Obviously, they have a a great data center product as well for data center switching, but it is interesting to see them, you know, expand into that area. So I I think, you know, it's it's also signaling where the industrial connectivity world is going. Right?

Rather than putting your, you know, industrial switches and running cable and fiber, I I think this is sort of signaling that this is where that part of the the networking world, that particular, you know, vertical I don't wanna call it a vertical because it's very generic. Industrial networking, whatever you wanna call it. Right? So I think that's where it's headed.

This is a signal for that.

Well, this was one of the promises of 5G as a technology when it first rolled out, right, with these private networks that you could build and the fact that the bandwidth and the the connectivity is now good enough that you don't need wired connectivity and and that you can secure it and have it be private over a 5G infrastructure. So this is one of the the proof points of that, like you said, for sort of an IOT or an industrial type of use case. So, yeah, I'm sure we're gonna see a lot more of this as time goes along.

Alright. Moving along to the last article for today, from Computer Weekly: Arelion upgrades their Scandinavian network to support the AI superhighway. We started with AI, Phil, so we gotta end with AI. Of course. Arelion is rolling out more infrastructure in their Scandinavian network to upgrade the bandwidth based on one point six terabit waves, using four hundred gig coherent pluggable optics from Ciena, actually, to build out more infrastructure and more bandwidth to allow for moving more AI workload data across what they're calling their AI superhighway between Oslo, Stockholm, and Copenhagen.

The whole AI thing, I think, is the key here. Right? We're talking about training clusters that are distributed, and inference farms are now demanding huge, huge east-west bandwidth. We're talking about one point six terabits per second of connectivity.

And is that gonna be future proof? I don't know. You know, maybe not the way we see things, but it's certainly a step in that direction.

And also, you know, there's the fact that that whole region has bought in and developed renewable power for years now, and the cooler climate is probably gonna attract hyperscalers that want to have that connectivity among their distributed data centers that are working on AI workloads.

And, so yeah. And I think, you know, the addition of the green energy part is gonna make that a lot more viable as well when you need, you know, many megawatts to make everything work. So, yeah, I think this is really neat, and it makes sense, and it's really cool from a pure network engineer perspective as well. I think twenty thirty is around when we're gonna start to see some of this come to fruition.

Yeah. And the other thing that was interesting in the article, to your point, is it talked about how much investment is being made in the Nordics in data center infrastructure. Right? Again, a lot of it has to do with the fact that the cooling is a little cheaper because you're in colder climates, but the article is saying that the region's data center construction market will reach seven point three eight billion dollars by twenty thirty, which is a compound annual growth rate of twenty three point four seven percent.

Some are estimating that the Scandinavian AI market will expand at a twenty six point two four percent compound annual growth rate to reach nineteen point nine billion dollars overall in twenty thirty one. So presumably that's not only data center construction, but all the money invested in power, space, cooling, compute, and networking, on top of the infrastructure, which is the seven point three eight billion. So, yeah, a lot of money being spent in that part of the world to build out these data centers and do a lot of the training and inference like we see in other parts of the globe.
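As a side note, the growth figures quoted here are just compound interest arithmetic. A minimal sketch of how those CAGR numbers work (the 23.47% rate is from the article; the base value and five-year horizon are illustrative, since the article's base year isn't quoted):

```python
def project(value: float, rate: float, years: int) -> float:
    """Project a value forward at a fixed compound annual growth rate (CAGR)."""
    return value * (1 + rate) ** years

def cagr(start: float, end: float, years: int) -> float:
    """Back out the CAGR implied by a start value, end value, and horizon."""
    return (end / start) ** (1 / years) - 1

# Illustrative: a market growing at 23.47% CAGR roughly
# triples (~2.87x) over five years.
growth = project(1.0, 0.2347, 5)

# Sanity check: the implied rate round-trips back to ~0.2347.
implied = cagr(1.0, growth, 5)
```

Which is why a rate in the low twenties compounds into such large end-of-decade figures.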

And it makes sense for the reasons that we've described already. And, of course, we always want more bandwidth, and you have the, you know, the default ability to cool more easily, and renewable energy, it all sounds great. But I see stuff like this in the news all the time, such and such a company bought, you know, four hundred acres in the middle of nowhere. This stuff happens constantly.

So where is the boom really gonna be? And I wonder if maybe it's anywhere and everywhere that land is cheap. And that's where I'm going with this. You know, the physical footprint for some of these data centers is enormous, and land is expensive.

And you need the land to be able to build on in the first place. You know, if it's big flat land, like in the middle of the desert or wherever in the United States that's commonly done. Is it really that viable to scale that large in that part of the world, in Northern Europe? I don't know.

And all those other benefits are certainly factors that I think are gonna drive growth. We're talking about multiple billions. Sure. I get it.

But I don't know. I hear about these kinds of investments happening everywhere. So I wonder where it really will take off and become the new, like, Ashburn, Virginia kind of thing.

Yeah. I don't know if it's gonna be quite, you know, the Northern Virginia Ashburn area, but, you know, I think this is definitely a big density and footprint in Scandinavia for a lot of these data centers. It's interesting, the different business models you see with this too. Like, I was talking to a data center operator a couple weeks ago in the United States, and they are buying land outside of metro areas where typically people don't buy land and space, because it's not necessarily right next to the fiber plant. Alright. So latency is not fantastic in that area, but they can get a lot of land, and they can get it fairly inexpensively.

They don't have great connectivity to the power grid, and they don't have great connectivity to the Internet, but that's okay because they're doing solar for their power. So when they buy this land and they start building out the infrastructure, they put in a big solar farm. They're generating most of their own energy using solar, and then, you know, they can connect into the Internet. It may not be as great as it is in Ashburn, Virginia, but they do have connectivity.

You don't really need super low latency for the inference part of it, right, the interaction with it. You really need the low latency inside of the data center for the training. So they're not as worried about, you know, another ten milliseconds of latency getting data in and out of the data center. Is that that big a deal? Probably not. So just a lot of interesting different approaches that folks are taking to this for various different parts of the AI stack, if you will.

Well, keep in mind though that as we I know we just talked about how Gartner said the future is not agentic.

But if we are using agents more and more, and a lot of folks are building their applications using foundation models, so they're hooking into APIs and doing their thing, and this is for the inference part, not for pretraining. Right? Latency is gonna be important, because you're talking about agents talking to other agents, and those talking to other agents, in order to do your workflow. And so I think that it will be, or already is, important to keep latency down, not just to produce the result from your model, but to go through this process of delivering your result to the next agent that's doing something in whatever workflow you're running.

The more complex, I'm sure, the more critical that is. So, that's just a theory. That's just an idea. We'll see how that pans out.
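To put a rough number on that theory: if each step in a sequential agent chain pays a network round trip plus an inference, the delays add up hop by hop. A toy back-of-the-envelope sketch (all the millisecond figures are made-up illustrative values, not measurements):

```python
def chain_latency_ms(hops: int, rtt_ms: float, inference_ms: float) -> float:
    """Total wall-clock latency for a sequential chain of agent calls,
    where each hop pays one network round trip plus one inference."""
    return hops * (rtt_ms + inference_ms)

# A 5-agent workflow where each call costs a 20 ms round trip
# plus 500 ms of inference is already at 2.6 seconds end to end.
base = chain_latency_ms(5, 20, 500)

# Adding just 10 ms of extra network latency per hop costs
# another 50 ms across the whole chain, and grows with chain depth.
extra = chain_latency_ms(5, 30, 500) - base
```

So per-hop latency that looks negligible for a single request gets multiplied by every agent in the workflow.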

But, you know, you brought up the real estate component. That's where I was kinda going with it as far as the Northern European countries. Like, you know, that's all cool, but do they have the space to do it like we do in the United States and in other parts of the world? I don't know.

Yeah.

So moving on to upcoming events. We don't have that many for you because it is the end of June, and we're going into the summer months. We are in the summer months. But we do have a few for you.

So first, we have the Ohio Networking User Group, part of the USNUA, in Cleveland on July tenth. That is actually being hosted by, I believe, Jason Gintert, who was a recent guest on Telemetry Now, our main show. So if you're in that area, go check it out. We have the AWS Summit in New York City.

I love how AWS calls these, like, their small regional events, and there's, like, ten, twelve thousand people there.

But the AWS Summit in New York City. Small is relative.

Yeah. Right. That's July sixteenth, and I believe it's at the Javits Center. It's a great place for an event, and if the weather's nice, it's nice to walk around that part of the city.

And the last one on our list is the Florida Networking User Group, again part of the USNUA, in Jacksonville on July seventeenth. Now there are a ton, a ton of events going on into August, September, and then into the fall, like a long list. We kept it to what's in the immediate future, but over the next few weeks, we're gonna start going through that list and maybe talking about which events are the important ones to pick and choose. We don't have unlimited budgets.

Right? And I'd like to do that a little bit with you, Justin, since we do so many of these. But for now, thanks so much for listening. Those are the headlines for today.

Bye bye.

About Telemetry Now

Do you dread forgetting to use the “add” command on a trunk port? Do you grit your teeth when the coffee maker isn't working, and everyone says, “It’s the network’s fault?” Do you like to blame DNS for everything because you know deep down, in the bottom of your heart, it probably is DNS? Well, you're in the right place! Telemetry Now is the podcast for you! Tune in and let the packets wash over you as host Phil Gervasi and his expert guests talk networking, network engineering and related careers, emerging technologies, and more.