Telemetry Now  |  Season 1 - Episode 22  |  August 22, 2023

Demystifying the role of AI and large language models in networking


There’s probably no bigger buzzword right now than artificial intelligence. But what is it really, and how is it being used in the field of networking? In this episode, Ryan Booth joins us to demystify what artificial intelligence is really all about and what it means for network engineers.

Transcript

There's probably no bigger buzzword right now than artificial intelligence.

Alright. Well, that's two words, but still. It's such a hot topic in the tech world and also even in popular media right now. So what I wanna do today in this episode is demystify what artificial intelligence is really all about, but specifically what it means for networking.

So with me today is a personal friend of mine, Ryan Booth, who has years of experience as a more traditional network engineer, a pretty high-level network engineer holding a CCIE among other certifications, but who for the last six or seven years has been laser focused on the DevOps and the software side of the house.

And more recently, Ryan's work and experience has brought him squarely into the center of this conversation about using AI technology in networking. So we'll be defining some terms, dispelling some myths, and hopefully shedding some of the marketing fluff around AI and getting to the heart of the technology itself.

My name is Philip Gervasi, and this is Telemetry Now.

Hey, Ryan. It's really good to have you, man. You and I have talked many times and we've known each other for years. I really appreciate your background in networking, and I've been following what you're doing now, in this space as far as AI.

And really your kind of transition, as far as your career, into the more DevOps-y world, I guess you could say. So thanks for joining the podcast today. And as we get going and before we dive deep into our topic, can you give us a little bit of background about yourself? I mean, I kinda spilled the beans a little to everybody that, yes, you have a background in networking, and now you're working in more of an AI space.

But from you, what are you up to these days?

Yeah. No, I appreciate it. I'm always excited to have this conversation right now, as it's a very hot topic and one I'm building a lot of passion around. So like you said, my core background's been in networking and infrastructure.

I've been doing that for the better part of fifteen-plus years.

Worked my way up to CCIE, worked for various companies, various organizations, saw all sorts of different types of networking, went into the vendor space, mostly with a deep interest in network DevOps, doing network automation. And working through that, I kinda built more of a passion around development and working through the development skill set as a software engineer.

I ran with that for quite a while up until right now, and that's currently where I earn a paycheck: managing a software development team doing web applications for our systems.

And recently, over the past year, year and a half, somewhere in there, I really started picking up on AI and ML. Mostly it started from a personal standpoint, with a lot of the AI art stuff out there. That's kinda where I got my hooks into it and really got excited about it. And then when ChatGPT landed and the whole AI craze hit, it kinda just accelerated. And as I started digging more and more, I found avenues inside of my current employer to pursue those, and I've kinda just been working through that now. And so that's kinda where my career has progressed, and kinda where it's going. So, yeah.

So then I'd really like to know why you went from the networking space in a DevOps direction. What was the impetus to do that? Because I remember when everybody was getting into network automation for the first time, and that conversation was happening out there in the community. What was the reason that you chose to kinda shift from traditional networking, configuring routers and switches and turning a wrench, into automation and then the DevOps world?

Yeah. So, yeah, that's a good question. It was kinda twofold.

I would blame my CCIE and the studies for my CCIE on a lot of it.

Okay. I agree. I agree with that.

Repetitive typing of RIP commands into a command line and building router config, BGP config, over and over and over and over. Right.

Right. Really wore me out. And then also, that was around the time when SDN was picking up a lot of traction.

Or at least a lot of market buzz is the better way to say it. Right. And through a number of vendors, through a number of social events, things like that, I recognized that there should be a better way to do all this.

And that's where I was like, okay, we need to add programmatic interfaces to the network. We need to be able to automate instead of manually typing these commands.

And that's kinda where it started. As things progressed over the next few years, and this was probably about twenty fourteen, twenty fifteen, somewhere around in there, configuration was getting more and more complex every single day. You know, you have MPLS, which has always kinda been out there with its complexities, but then EVPN hit for the data center and the complexity with that just went up as well, and it's like, okay, we can't keep doing this manually.

We have to introduce some sort of programmatic way to handle this at scale. And so that's where a lot of my passion shifted. I don't wanna configure a router through the CLI anymore.

And do you think that the more recent shift from, you know, network automation and DevOps into a focus on AI is kind of the logical progression of that change?

I think so.

I think it'll be a decent jump, but it's kinda hard to say.

But I do.

AI is just not gonna sit down in your chair and take over your job and do things manually for you.

Oh, thank goodness.

It's gotta be built into a workflow.

Right.

You know, we gotta figure out how to handle it, and all of this stuff is still way up in the air as to how it's done. But it's gonna be done programmatically. That's for sure. And so to be able to work on infrastructure and to work on network kit, you've gotta be able to do it programmatically.

So that's kinda the spirit, the underlying premise here of everything: the real benefit of the shift from traditional networking into the DevOps mindset and network automation, and now into the application of AI concepts and workflows in networking, is to add a programmatic element to actual network operations, to the mundane tasks of running a network: troubleshooting, fixing, learning what's going on. If you're looking at it from a visibility perspective, it's really operations focused. Right? I mean, I don't wanna downplay the importance here. That's a very important thing. We're talking about application delivery and the applications that run my hospital and run the United States military and run mundane things like my productivity tools, like, you know, Word and PowerPoint that I have online these days.

So as much as, you know, we look at these shifts as we just wanna make configuring devices easier, it's really solving a network operations problem. It's not like magic. Like, I don't like it when people say, look at what we do, it's magic. You know, that's silly.

This is technology. It's code and specific database choices. It's specific technology choices in order to solve a specific problem. And in this case, I really feel like it's an operations problem more than anything else.

Do you agree or am I wrong?

Yeah. No, I absolutely agree there. It's totally operational.

You know, if you look at anything out there, the design and deployment is a smaller percentage of, you know, the life cycle of an infrastructure.

So, you know, there are places there to be able to improve and to automate.

But operations is where we need that focus, and we need that effort.

So, yeah, totally agree there.

Okay. So I was talking to one of our data scientists months ago, actually, at Kentik, so internally, and we were chatting about his background. And he made the comment that, you know, he's not in networking, or rather he is now.

He wasn't in networking. He's got a PhD in science with a focus on machine learning, among other things. And he was working in some kind of an aircraft sharing industry. I don't know if it was manufacturing or not.

But in any case, what they were doing was collecting an incredible amount of telemetry data from the various systems of a particular aircraft. And then they were analyzing that and applying advanced analysis workflows. I'm gonna call it that and not use AI and ML just yet. Right?

They were using these data analysis workflows to figure out what was wrong, what elements in their various visibility tools were correlated to each other, ultimately for the purpose of sending out a tech, a human being, to go fix the problem. Now we know what the root cause is, let's go fix it, rather than let's go mess around with wires and widgets for hours and weeks. We need the aircraft back online right now.

So let's use this tool to send out a human.

And I really feel like that's what we're doing now. We're applying these more modern data analysis workflows. When I say modern, by the way, sometimes I wonder if they really are that modern, because we've been using them in other industries for a long time, but only more recently in networking, which I wanna touch on later. But we're applying these workflows, new to our industry at least, in order to send out an engineer, in order to get a human being to fix the problem faster.

But in that conversation, he didn't use the term AI even once. He didn't say artificial intelligence even once. So for me, this begs the question, and we're talking to you, Ryan: what is artificial intelligence? Is it different than machine learning? Is it different than just kind of a college-level statistical analysis? Is it just a buzzword, or is it something unique that we can point to and say, this thing over here, yes, this is AI?

Yeah. No, I think this is a very important topic for us to discuss, especially in the modern day right now, because there's a lot of buzz.

There's a lot of stuff going on, and there's a lot of misconception here.

And I like to throw out an analogy that we've seen in the industry a number of times. We get these buzzwords, we get these cool new technologies, and, if you relate it to cars and the different styles of cars and the different problems they solve or the features they provide, everybody wants to instantly jump to a Ferrari. Or a Lamborghini.

You know, that is what's gonna solve all of our problems. And we don't realize that what we have and the problem we're trying to solve can possibly be solved by a Toyota Corolla. Mhmm. And that's how I see it here.

You know, you have these large language models, you have ChatGPT, you have even deep learning stuff that goes way more advanced as well. For a lot of problems, you don't necessarily need to go that far; you can use just basic machine learning with basic algorithms.

So I kinda break everything down into three sections right now.

And the first one being basic machine learning.

The stuff that's kinda been around for a very long time, we're talking the eighties or further back. And, you know, it's a model that has very few layers. One, maybe two.

It's simple algorithms. Well, simple relative to AI and ML.

It's simple algorithms like linear regression, and then just pushing that through a few layers doing, you know, basic what's called forward propagation and backward propagation for your training and your fitting of your model.

And so these types of models are usually very, very focused on a given task. They're not generalized like what you're seeing with ChatGPT.

They have one specific job and they do it well. And so that's where ML comes in pretty well. And that's what's been around for a while there.

After that, you basically don't change up a lot of what you're doing. You just introduce new layers, and you go a little bit deeper with the same type of stuff.

And how you stack those together is kinda where deep learning comes into play. So, you know, if you have multiple layers inside of an ML model, and you replicate that however many times you need to, that's when you start getting into deep learning.
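To make that concrete, here's a minimal sketch of the kind of "basic machine learning" Ryan is describing: a single-layer linear model fit with forward and backward propagation. The data and variable names are made up purely for illustration, not drawn from any real network.

```python
import numpy as np

# Toy training data: interface utilization (x) vs. observed latency (y).
# Purely synthetic numbers for illustration.
x = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
y = np.array([2.0, 2.9, 4.1, 5.2, 5.8])

# A "model" with a single layer: y_hat = w * x + b
w, b = 0.0, 0.0
learning_rate = 0.1

for epoch in range(2000):
    # Forward propagation: compute predictions and the loss (mean squared error).
    y_hat = w * x + b
    loss = np.mean((y_hat - y) ** 2)

    # Backward propagation: gradients of the loss with respect to w and b.
    grad_w = np.mean(2 * (y_hat - y) * x)
    grad_b = np.mean(2 * (y_hat - y))

    # Update (fit) the model.
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"fitted model: y = {w:.2f} * x + {b:.2f}, final loss = {loss:.4f}")
```

Deep learning, as described above, is essentially this same forward/backward loop with many more layers stacked on top of each other.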

So then I see artificial intelligence as kind of a broad category. It is the idea of saying, let's teach these systems, or rather create these systems, to think like a human.

And it's that broad. Like, it doesn't really mean that much more than that. Yeah. It is computers that we program to think like a human being.

And machine learning is a technical component of how we do that. So we're talking about training the model with data, and there's different ways to do that, I get it.

And then the application of models to datasets to then derive some sort of insight. So machine learning in that sense is a technical function of the broader category of artificial intelligence. Right? Yep. Okay.

Yeah. And I would even argue it's, you know, the foundation of it all.

And that's, myself personally, as I've been ramping up here, where I've focused all my energy.

It's really understanding ML, the various models going on, and then digging into deep learning, because I think that's the core of all of this.

Well, you know, you look at these models and how they're built and how the LLMs handle stuff, and it basically is that at its foundation, but just done at a very large scale. Okay.

So then, machine learning, which has been around for a long time. It's been around for decades and decades.

Why do you think that we're only starting to apply these types of data analysis workflows to the networking industry today, or at least in recent years? Maybe not today as in literally two thousand twenty-three. It's been a few years, but recently.

Yeah. You know, that's a good question.

I think the networking industry and the infrastructure industry have kind of always been a little slower to adopt newer technologies, especially networking. Security seems to kinda be the same way, in my opinion.

Server infrastructure has, you know, gone through virtualization and then Docker, containerization, stuff like that.

But to really take this type of stuff and start applying it, I think companies have tried.

But I think the ML buzzword, for machine learning, really didn't take off until twenty fifteen or so, when a couple of things progressed.

And then most recently with generative AI and LLMs, you know, that's been even later. Twenty sixteen, twenty seventeen.

And so, you know, you have various companies out there, and I think most large vendors and most companies that are working in the space have dabbled in it or tried with various products to get there, with mixed results. So I think it's been there, but the marketing hype really just hit, you know, over the past few years.

Yeah. Yeah. I remember the marketing hype around other buzzwords over the years. Right? Remember around the time when you and I first met everybody was talking about SDN.

And even then it meant very little. What is software-defined networking? And it's funny because here we are. I don't know when that was.

That was probably eight years ago when we met.

Nobody uses that term anymore. Right. It's gone from the zeitgeist and from, you know, the narrative in our community. We don't even talk about it.

Maybe because the concept of SDN has been so integrated into so many of the tools that we use that it's just ubiquitous, and therefore we don't talk about it. But I think we don't talk about it because it was mostly just a buzzword. And that's why I have that kind of bad taste in my mouth when I hear the term AI thrown around so flippantly and loosely.

Because I know from experience, I know, for example, why my company applies certain ML models and why we don't apply certain ML models. It's to solve a problem. Right. You know, for example, we are looking to find seasonality in network data.

And we apply a model and it's way off, and it makes no sense. So we don't use it. And maybe instead of using a more complex algorithm, we can do something much simpler, akin to what you would learn as a junior in college. Right?

But lo and behold, it gives us the answer that we want. So I really see the technical components in modern data analysis workflows as tools in our tool belt. We use something when it makes sense. Like, we would use any tool in our tool belt when it makes sense, and then we don't when it doesn't.
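As a rough illustration of that "simpler tool" idea, here's a minimal sketch of seasonality-aware anomaly detection on an interface traffic series using nothing more exotic than an hour-of-week baseline and a z-score. The data, threshold, and column names are all hypothetical, not any particular product's approach.

```python
import numpy as np
import pandas as pd

# Hypothetical per-hour traffic samples for one interface (bits per second).
df = pd.DataFrame({
    "timestamp": pd.date_range("2023-07-01", periods=24 * 28, freq="H"),
    "bps": np.random.default_rng(0).normal(1e9, 1e8, 24 * 28),
})

# Build a simple seasonal baseline: mean and std for each hour of the week.
df["hour_of_week"] = df["timestamp"].dt.dayofweek * 24 + df["timestamp"].dt.hour
baseline = df.groupby("hour_of_week")["bps"].agg(["mean", "std"]).rename(
    columns={"mean": "baseline_mean", "std": "baseline_std"}
)
df = df.join(baseline, on="hour_of_week")

# Flag points that deviate sharply from the seasonal norm (|z-score| > 3).
df["zscore"] = (df["bps"] - df["baseline_mean"]) / df["baseline_std"]
anomalies = df[df["zscore"].abs() > 3]
print(anomalies[["timestamp", "bps", "zscore"]])
```

Nothing here is beyond an intro statistics class, which is exactly the point: sometimes the Corolla gets you there.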

So we're trying to forecast or detect anomalies. Right. So we're trying to identify things. Here's something that, you know, we struggle with in the industry.

I know. And that is identifying dependencies, short-term dependencies and long-term dependencies. Right? So we have this causal relationship in the data where we can say, hey, look, this thing over here is causing this manifestation in the network over here, but it's a short-term dependency.

Those things, like a CNI on a container, right, don't even exist for very long. So it's not actually gonna cause that over a long-term time period, whereas a configuration might be a long-term dependency. Right?

Because that's something that's more static. So how do you encapsulate that into algorithms? How do you encapsulate that into literal math that lives in Python, that lives in Jupyter notebooks, that lives in whatever kind of database you're using? That's tough.

And so I really look at all of these components as tools to help a human being, an engineer, solve a problem. You know, I found that just choosing the right kind of database, so you can query all that data fast, so you can actually use it, you know, when you're trying to troubleshoot applications and things. Right?

That in and of itself is a great step forward. And, yes, that's part of the overall picture here, because, you know, it's part and parcel of how do we ingest data? How do we query data as part of a data analysis workflow? I get it.

But think about it. Are we using a relational database or a columnar database? And what is the benefit and the drawback of those? So I think there's so much more than just "we use machine learning."

I remember seeing, I don't remember who it was, some company, they had an ML button in their UI, in their menu. You know what I mean? It was a screenshot I saw. I think it was on LinkedIn or maybe a YouTube video.

And I just remember thinking, I paused it and I'm looking at it, and I'm like, what does that do?

So I click that button, the machine learning button? Really? Like, that makes no sense. To me, it's an underpinning function that produces a result. And it's like, alright, you got an alert that there's this problem, or you get this message from the system, if you're using some sort of ChatOps maybe, and it says, hey, you have this increasing cost in your AWS egress over here, and we believe that the likely cause is that you shifted from data center A to this data center B in this part of the world.

That's like insight derived from data. To me, that's machine learning happening. It's not a button that I press. You know what I mean?

Yep.

And I think, you know, it starts there.

Those are where a lot of problems can be solved, and that's where a lot of problems should be solved.

You know, there's definitely been talk around the industry, exactly like you said, that, you know, let's get to a point where we can have a single application do this root cause analysis for us, or we can have this central dashboard that'll show us absolutely everything we need to know, and we just click buttons to solve it.

And I think where that's always fallen short is being able to tie together the people who know how to make those elements correlate with the people who know how to build the system, build the algorithms, and then build the ML system to put it together. And I think that's where, as an industry, we've struggled for quite a while.

But it's also massively complex. You take one given instance, say, like, an interface flapping, for example. Knowing, you know, that an interface is actually flapping between two nodes: is it an SFP, is it an optics issue? You know, is the CPU on the router overloaded?

Is the cable bad? What is it? And just a simple thing like that, to actually break it down and have a system be able to correlate and figure that out, that gets complex.

And that's one of the simpler issues.

Yeah.

And so putting all that together, it becomes a massive ordeal. And where you can take something like ML, or even deep learning models, is to be able to pump that data into it and let the model actually recognize, with normal traffic flow coming in and out at constant times, where it's seeing issues and where things crop up.
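One hedged sketch of what "pumping that data into a model" could look like, using scikit-learn's IsolationForest on a few per-interface features. The feature set, numbers, and thresholds are invented for illustration, not a production recipe.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical per-minute samples for one link: [utilization %, errors/min, CPU %].
rng = np.random.default_rng(42)
normal_history = np.column_stack([
    rng.normal(40, 5, 500),   # utilization hovers around 40%
    rng.poisson(1, 500),      # a handful of errors per minute
    rng.normal(30, 5, 500),   # router CPU around 30%
])

# Fit on "normal" history, then score new samples.
model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal_history)

new_samples = np.array([
    [42, 0, 31],    # looks like business as usual
    [41, 250, 95],  # error storm plus CPU spike: likely flagged
])
print(model.predict(new_samples))  # 1 = looks normal, -1 = anomalous
```

The hard part Ryan goes on to describe is not the model call; it's deciding which features actually matter and keeping the training data representative.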

I think where that gets complicated, well, I obviously have not, you know, worked on this hands-on directly, so I'm just kinda going from a theoretical standpoint.

Mhmm.

But where that gets complicated is you have to be able to identify your various channels and identify the data that actually matters at the various layers of the model, or what gets inputted into the beginning stages of the model and where it learns from there. And then it's a matter of training and retraining to go through it. That's where it could really help out, because we don't have to manually correlate all this stuff together for every single possible issue that could be out there. Let the models do their job and find stuff for us.

Yeah. Yeah. And I have actually found, in talking to some data scientists, both at Kentik and in other places,

that that's easier said than done. Yeah, easier said than done, that's the phrase. Because it's actually easy to find correlation.

It's not hard. The difficulty comes in when you find things that are correlated, but who cares? It's, how do you add the subjective component, the human part of engineering?

The idea is, okay, these things are related, but that doesn't affect end-user experience in my New York office. It's like, okay, well, then do I care? And if they are correlated, is it a spurious correlation or is it a causal relationship? Is there a third variable at play here that we're not identifying?

So I've seen that correlation is not difficult to identify.

You know, it's just assigning a correlation coefficient based in math. It's not hard.
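For instance, here's a minimal sketch of that "correlation coefficient based in math": computing a Pearson correlation between two network metrics with NumPy. The series are made up, and a high coefficient on its own says nothing about causation.

```python
import numpy as np

# Hypothetical time-aligned series: retransmit rate on a link and
# application response time (ms) measured over the same intervals.
retransmits = np.array([0.1, 0.2, 0.4, 0.3, 0.9, 1.2, 1.1, 0.2])
response_ms = np.array([20, 25, 40, 35, 90, 120, 110, 22])

# Pearson correlation coefficient between the two series.
r = np.corrcoef(retransmits, response_ms)[0, 1]
print(f"correlation coefficient: {r:.2f}")

# A value near 1.0 says the series move together, not that one causes
# the other; that judgment still needs an engineer (or more analysis).
```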

But having meaningful correlation, or finding meaningful correlation, is different. It's different. Right. Because again, it's the idea that we have very dynamic, not static, networks.

We have ephemeral information. If we start getting into containers, and end users possibly, if you wanna collect information from there, we're talking about very, very divergent data. So if we're looking at analyzing data, it's not just like in the health care industry, just analyzing, like, information from MRIs, right, where it's all similar data.

And then you're trying to figure out and forecast some sort of... I've seen that, and that makes sense. But when you're looking at network data, you're looking at very, very diverse types of telemetry that are on incredibly different scales and formats and types, and represent very different things. And a lot of it's subjective.

Or not subjective exactly; a lot of it's not quantitative, in the sense that it's a tag.

It's a label. It doesn't represent any particular activity. It's like a security tag or an application tag or a user ID. Process ID on a container.

So how do you fit that into your algorithm? I've always thought that's one of the reasons there's been a little bit slower uptake in the networking industry, because we have that difficulty to overcome. But you mentioned ChatGPT a few times already. You mentioned LLMs a couple times already.

What does that have to do with anything? First of all, what is an LLM? And what can I solve with that in the networking space?

Yeah. So LLMs, and basically the invention of what's called transformers.

If anybody's heard that buzzword out there, it's not a buzzword. It's the technology actually used.

You know, if you take a deep learning model that's a large number of layers, and then you take that and make it much, much larger with a much larger corpus of data, a collection of data that is very generalized.

And you push it through a transformer model. So basically, if we're talking LLMs and we're talking ChatGPT, I'll take that route, this is usually text generation.

Okay?

And text prediction. So, what's the rest of this sentence, or respond to this question that I have. With deep learning, when you're looking at some of the examples out there, like the older models, RNNs and what's called LSTMs, which is a way RNNs are built, you pump in a handful of words and it guesses the best next word after that. And the best way to visualize that is if you think about going to Google and typing in a search: it has a dropdown that suggests the next words to use. Same thing on your phone when you're texting, it gives you text predictions, but it only predicts the next word.

It only predicts, you know, the next two or three words that you might be using. It doesn't take the entire sentence into context and respond in full. And that's where LLMs really took a step forward: the transformer model allowed you to input a larger set of text, one or two sentences or even more, and get a full response back, not just the next word.

And so that opened up a lot more context for you, because, you know, that's kinda the key here: if you're just predicting the next word, you don't know the context past three words back. Okay. And with a transformer and with GPT-style models, like GPT-3, Bard, LLaMA, Anthropic's model, the ones that are out there.

All of those are where you can actually start interacting with it. And that's what we're seeing now. And so that's when LLMs really started coming into play, and that's where we are right now with the generative side of it.
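To illustrate the "predict the next word, then the next" idea Ryan contrasts with full-context responses, here's a tiny sketch of an autoregressive generation loop. The `predict_next_token` function is a stand-in for whatever model you plug in (an RNN, an LSTM, or a transformer); only the loop structure is the point, and the canned lookup table is obviously hypothetical.

```python
from typing import List

def predict_next_token(context: List[str]) -> str:
    """Stand-in for a real language model: return the most likely next token
    given the context seen so far. Here it's just a canned lookup."""
    canned = {
        ("show", "ip"): "route",
        ("show", "ip", "route"): "<end>",
    }
    return canned.get(tuple(context[-3:]), "<end>")

def generate(prompt: List[str], max_tokens: int = 10) -> List[str]:
    """Autoregressive loop: each new token is predicted from everything
    generated so far, then appended and fed back in."""
    tokens = list(prompt)
    for _ in range(max_tokens):
        nxt = predict_next_token(tokens)
        if nxt == "<end>":
            break
        tokens.append(nxt)
    return tokens

print(generate(["show", "ip"]))  # ['show', 'ip', 'route']
```

A transformer-based LLM runs essentially this same loop, but each prediction attends to the whole prompt rather than the last few words, which is what makes full, context-aware responses possible.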

Yeah. And I have a subscription to ChatGPT. I use it pretty frequently. It's very interesting. I have experimented with it to see how I can kinda break it, and I say "break it" in air quotes, to see what kind of responses I can elicit.

To me, it's not hyper useful because of what I do for a living, so I don't necessarily need it to succeed at my job.

But how do you think about ChatGPT, or at least similar types of technologies? What role do they play in networking, or what role will they play, do you predict?

Yeah.

It goes a number of different ways.

From what I've explored right now, and I'll say the absolute most important response to that question right now, is we don't know.

Okay.

We have guesses.

Some of us, a lot of us, have started playing with it and started exploring and figuring out where we can and can't use it.

But for the most part, where it's gonna land is really hard to tell right now. It's gonna be a very integral part of our jobs.

But we just don't know how yet.

So you mentioned earlier you have a ChatGPT subscription. I do as well. I highly recommend to anybody out there: if you can get that subscription, beg your boss, get them to pay for it, or any of the others that are out there, and really start using them. Just figure out and explore how to use them, because that's how we're gonna figure this out. It's just by doing, just by trying, experimenting, playing with it, seeing what works, seeing what doesn't.

But for right now, the roles that I see at play: the first one, right out of the gate, especially with the LLMs and GPT models, is it's gonna improve our interaction with programs, with applications, with devices by introducing natural language communication.

And so you'll see it out in the industry termed NLP, or natural language processing, which is a whole field in and of itself.

But being able to actually use your natural language, in whatever language you speak as well, and interact with those devices, that's gonna be the critical part.

And that's gonna be what comes out of it. Now, how it does that still needs to be figured out. So instead of having to know a specific CLI setup, having to know a specific programming language or, you know, what operating system you're working with, that stuff should get smoothed over and you just say in natural language what you want.

And I remember that was one of the first things that I did experiment with in ChatGPT when it came out a while back: telling it to configure a thing. You know, I would give it parameters, and then it would spit out a configuration in, you know, Cisco CLI or Juniper, whatever I was asking it to do. And it was okay.

Mhmm. So you think that's kinda where we're going as a first major step, using that technology as a means to make managing, configuring, and interacting with our individual devices, and therefore networks as a whole, just easier. Yeah. Yeah.

Yeah. And I think it goes across the board. So, you know, instead of having to sit down and spend a week or two, or however long it takes, to build the config for a new router or a new firewall or a new switch or even a new server.

You can very quickly, just on a chat interface, say, hey, I need this, this, this, and this, build it out in a bullet-point list, send it in, and get configuration back. And if it's not perfect, and especially right now a lot of what you see out there isn't perfect, you get probably seventy-five to ninety percent accuracy if you're lucky, but that cuts down a lot of the work that you have to do. And so now you can do that last ten to fifteen percent of tweaking, and you've saved yourself X number of hours of time.
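As a hedged sketch of that workflow, here's roughly what asking a model for a config from a bullet list might look like using the OpenAI Python client. The exact call shape depends on the library version, and the prompt, model name, and requirements are assumptions for illustration; the output is the seventy-five-to-ninety-percent draft Ryan describes, not something to push to a device unreviewed.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

requirements = """
Build a baseline Cisco IOS config for a branch router:
- hostname BRANCH-RTR-01
- Loopback0: 10.255.0.1/32
- GigabitEthernet0/0: 192.0.2.1/30, description "WAN uplink"
- OSPF process 10, advertise both interfaces in area 0
- SSH only, no telnet
"""

response = client.chat.completions.create(
    model="gpt-4",  # model name is illustrative
    messages=[
        {"role": "system", "content": "You are a network engineer. Output only configuration text."},
        {"role": "user", "content": requirements},
    ],
)

draft_config = response.choices[0].message.content
print(draft_config)  # treat this as a draft: review and tweak before deploying
```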

The other area that I feel is gonna be pretty universal, and I think it'll have the same impact that the internet had on everybody, is you will now have an expert in your pocket at all times.

For basically almost anything.

And we're seeing this all over the place. So, you know, we've always joked in the industry, especially with network automation, that at any given time an end user's gonna be able to click a button and get the same operations a CCIE would provide, or get the same level of configuration or, you know, exposure to the infrastructure as a CCIE would give you, but you have it in front of you. And that's also what I think we have here. And so through my experimentation and through the stuff I've played with, you know, I can go into these tools. I can go into ChatGPT or go into any of the other models that are more fine-tuned for something. And I can ask, how do you solve this specific problem in Python?

And then I get, you know, the response, and it's like, okay, you do this, you do this, and do this. I wasn't thinking about how you could do it that way. Okay, well, now how do I do that in C#? How do I do it in Go? And then it'll turn around and give it to you in Go. So instead of having to ramp up and learn all that information, or know how to get that information off the internet or from your team, it's just right there at your fingertips.

And that will just continue to improve. And I think it's opening up the doors for a lot more people.

I think one of the things that I'd like to see happen, more than just the interface with our devices and it being able to spit back information about our devices, or at least give us configuration and help us, like you said, solve problems in that way, is the integration with the actual data of our networks. Right? So that way, hey, whatever we name it, it's like Lieutenant Commander Geordi La Forge talking to the Enterprise.

Right? Computer, why is my Chicago office operating very slowly? And then it looks at the data, and then it's using artificial intelligence and machine learning and applying models it probably already has. Hopefully, there's an alerting system where it tells you before you ask it. Right?

But in any case, it says, ah, we identify that there is slowness on this particular interface, and it's likely caused by this thing that's happening over here, DNS resolution times, and Route 53 is really slow with this particular file, whatever. Something that, yeah, we could have figured out as engineers. It's not like the computers are necessarily, quote unquote, smarter than us.

They're just doing everything at scale, much faster than us. This is why I say, and maybe you disagree with this, if I could afford, like, a team of a few hundred PhDs from, like, MIT or something like that, I could probably just say, yeah, you are my, like, human ChatGPT. You just analyze the data, do all your stuff, and be in that room doing it for me.

But instead I have computers that can do it dramatically faster and at larger scales. Mhmm. Ultimately, hopefully, being able to derive insight that I wouldn't otherwise have been able to. So I would love to see that integration with the data, plus the ability to have that human language component that we've been talking about just now, and the ability for it to spit back answers and spit back configuration.

And maybe one day we follow up with, okay, that sounds good, push that configuration to my North America offices, or something like that. I don't know.

I have to say, we started off talking about how SDN was buzzword stuff, but I have to admit, in my mind's eye, that's kind of what SDN was always supposed to be.

You know what I mean?

Yeah.

I think so. You know, everybody's talked SDN to death, but I think we'll go through similar steps and motions with AI that we did there. SDN was supposed to just kinda be smoothed out for the end user. It was supposed to just be easy.

And while stuff did come out of it, you got, you know, these automation tools out there that build infrastructure for you automatically.

You have stuff like, you know, MPLS at the WAN edge. You got technologies like SD-WAN. And they were basically the right step for SDN, just not how the industry looked at it. SD-WAN didn't come in and say, hey, everybody's gotta learn to do this complex MPLS configuration for your WAN. No, you plug in this box and you click these five buttons, and you're done. And you have this really complex config, but it's smoothed out and simplified for you. I think we get the same thing here.

You know, we've made attempts in the past with multiple vendors. We build these big telemetry boxes, or we go through this really complex data aggregation where we pull all the data in there, and it's all at your fingertips, but the problem is you have to know the syntax to query it. Yeah. And you have to know how to do that effectively.

And there's only a handful of people out there that know how to do that effectively and know how to interpret the data.

So we are seeing stuff surface over the past year or so where companies are utilizing the newer LLMs, utilizing the newer technologies like NLP, to be able to simplify things and allow anybody to come in and ask that question and get reasonable data back. And that's the other part of it with NLP: you can interpret the question I'm asking into the complex queries, so that's natural language understanding.

The reverse side of that is you get this massive, massive amount of data that comes back at you, and you have to also understand how to interpret that. Well, part of NLP is NLG, or natural language generation.

So your LLMs, your solutions, they actually boil, or shrink, all that down into a nice summary for you.

So instead of getting this large, complex set of data back that you gotta interpret, or logs and events, you get a simple response, like, oh no, it looks like interface xe-0/0/1 is having issues with packet drops. You get it boiled down to a simple response. I think that is where it's crucial and where it's gonna simplify stuff for a lot of people.
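Here's a minimal, hypothetical sketch of that round trip: a natural-language question is turned into a structured query (the NLU half), the raw results come back, and a model summarizes them into a one-line answer (the NLG half). The helper, table name, prompts, and canned rows are all invented placeholders rather than any particular product's API.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

def ask_llm(system: str, user: str) -> str:
    """Small wrapper around a chat completion call; model name is illustrative."""
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "system", "content": system},
                  {"role": "user", "content": user}],
    )
    return resp.choices[0].message.content

# 1) Natural language understanding: question -> structured query (SQL here,
#    against a hypothetical interface_stats table).
question = "Why is my Chicago office slow right now?"
sql = ask_llm(
    "Translate the user's question into a SQL query against "
    "interface_stats(site, interface, packet_drops, ts). Return only SQL.",
    question,
)

# 2) Run the query against your telemetry store (placeholder function).
def run_query(query: str) -> list:
    # In a real system this would hit your database; canned rows for the sketch.
    return [{"site": "chicago", "interface": "xe-0/0/1", "packet_drops": 5400}]

rows = run_query(sql)

# 3) Natural language generation: raw rows -> short human-readable summary.
summary = ask_llm(
    "Summarize these telemetry rows for a network engineer in one sentence.",
    str(rows),
)
print(summary)
```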

Oh, yeah. I mean, if I had that tool when I was troubleshooting networks and trying to figure out problems, where I could literally just ask the network, right? That would be amazing. And if that's where we're headed, that's great. But what are your experiences then working with ChatGPT in your experiments? I mean, you don't have to get into what you're doing at work necessarily, but I know that you've built some applications with ChatGPT. What are your experiences there with that?

So, first and foremost, the one I love the absolute most is the repetitive tasks. So if I'm going in, and right now as a software engineer, I need to build out a feature in my application.

And that feature has all these different components that have to be built out, and then there's this whole list of stuff that needs to be done. It's pretty cookie cutter. And in the past, it was basically just manually typing all this stuff out and grinding through it, and that was part of a software engineer's job. And we see that a lot in network automation with the various, you know, config files we gotta build or the various modules that have to be built to handle our infrastructure, yada yada. I can just quickly go to ChatGPT and say, hey, build this, do this for me, give it the specifics I need, and it spits it out. And so it's then just copy and paste into the code base, update it as I need to, and then move forward and do the complex work myself.

So that's a massive time savings there.

And then it also gets into what I mentioned earlier, the stuff where it's like, well, how would I solve this problem? If I gotta do this, this, and this, how do I build that, or what's the most robust way to do it? I can turn around and very quickly ask ChatGPT, hey, how would you do this?

And take what it says and be like, oh, okay. They're doing this, this, this, and this, and then you can take it from there. And so you get that expert opinion, or at least a starting point to move forward.

So to me, those are the two biggest ones.

You know, I love to use it to type emails, especially general emails that happen all the time, or there's a lot of emails that I just don't enjoy typing because I struggle with how to word them. And so it's like, alright, ChatGPT, how are you gonna word this email?

Okay. That's way better than what I would say.

Thank you.

You know, with some tweaking. And so that helps me stay focused on, you know, my day-to-day.

Another one that was really cool, that my team lead actually did, I won't take credit for this one, but I thought this one was awesome. We were in the middle of a planning session for our next sprint, for the next couple weeks of work that our team was gonna be doing, and we were trying to scope out a feature that we were gonna be building. It was one that we wanted to add in, but we didn't know how much time it was really gonna take or how complex it was gonna be. So we couldn't guess or estimate the number of hours it was gonna take.

So we couldn't tell if we were gonna fit it in the sprint or not. And so what my team lead did, he shares his screen while he's doing all of this, he goes up, he asks ChatGPT.

How do you build this feature? How do you do this specific thing? It spit it out. He had it tweak it a little, and as he's seeing the code get built on the screen, the whole team's watching as well.

Mhmm. And so we can see, okay, if we're building it for ourselves, we can visualize everything that needs to happen. And it's like, oh, we didn't think about this part, but ChatGPT caught it. It's like, hey, you gotta build this out.

And we're like, okay, then we can better scope this, and then we were able to say, hey, it's probably gonna take a week.

And so that was another cool way that we were able to just use it on the side. Now those are more admin-type things, so those are kinda more boring.

Boring, but needful. I really feel like, with what we've been talking about, because the broader topic of this conversation today is AI, which ended up being about ML,

the large language models, specifically in this case ChatGPT, are an interface between a human being and all that AI that's happening. And so now we have an easy way to get access to the data, not just the data, but the meaning in the data, the insight from the data, and manipulate it on the fly, and then get answers very fast, get information very fast. And then, you know, we're talking about developing config and all that stuff as well, and I get it. But what are some of the problems then with this interface? I mean, did everything go perfectly according to plan in your development of these applications, and then with your team lead developing that application?

Yeah.

No. It doesn't.

You know, I think we're a decent ways away from getting total, a hundred percent accuracy on anything.

But also, well, that's a lot of the misconception. And let me call that out right now.

Sure.

So one of the core understandings of data science and ML and AI is that there's a certain point where you can't get any better than a human could.

So if a human is able to do a job at ninety-five percent accuracy, it's very hard to say in a lot of situations that a machine model, an ML model, an AI model, can do any better than that. Really? Now, there are applications where it can, but there's a very small gap in how much better it can do than a human. There are exceptions to that rule, but in general we shouldn't be expecting it to do much better than a human in a lot of scenarios. So if you are building config and a human can only do it at ninety-eight percent accuracy, probably not even that good, you know, if you get that close, you're in a win scenario.

In my experience, I found that knowledge of what you're trying to build and knowledge of what you're asking the AI or ML system to provide you is critical.

So you get much better results if you are able to be more detailed in the prompts that you give. So people have started talking about prompt engineering as a career.

And I don't know if I'd go as far as that, but it will play a very critical role, in all of this, being able to provide very good prompts.

With that, I've learned some things. So I did this project that I'm kinda just playing around with going forward. And I was like, I'm just gonna have ChatGPT build me a full containerized web application that is built on Flask and Python and yada yada, and have it deploy into a cloud environment or Kubernetes or whatever. And I'm not gonna touch the config at all. I'm gonna have AI do everything, specifically ChatGPT 4.

And then, so what I did is I started walking through the project and it's like, okay. How would I build this out myself?

And so I started asking ChatGPT to build all the configs for me.

The first thing I needed was exactly that knowledge of how the application is built, so I could get the configs out of it.

Yeah.

Then what I found out later on is that order of operations was pretty critical.

So when I would start going through it, things would go smoothly, configs would get updated as I typed requests for new configs in. But later on, when I had to jump back to a specific config file or to a specific module in Python and say, hey, I need you to update this to add this, this, this, and this, it had a harder time with that. And so the workflow and the order of operations as you go through it, I feel, are more critical, because the AI still gets lost in some of that information.

So to me, those are kind of the areas that you have to pay attention to.
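One way to picture why order of operations matters: in a chat-based workflow, the model only "remembers" what's in the message history you keep sending back. Here's a small, hypothetical sketch of maintaining that history across iterative config requests; the helper, prompts, and steps are illustrative, not any particular tool's workflow.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

# The running conversation is just a list of messages. Everything the model
# "knows" about earlier steps has to be carried along in this list.
history = [
    {"role": "system",
     "content": "You are helping build a containerized Flask app. Output files verbatim."},
]

def ask(step: str) -> str:
    """Append the next request, call the model, and keep its reply in history."""
    history.append({"role": "user", "content": step})
    resp = client.chat.completions.create(model="gpt-4", messages=history)
    reply = resp.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

# Order of operations: each step builds on the files produced before it.
dockerfile = ask("Write a Dockerfile for the Flask app.")
compose = ask("Now write a docker-compose.yml that runs that image.")
# Jumping back is where drift creeps in: the model has to reconcile this change
# with everything already sitting in the history.
updated = ask("Go back to the Dockerfile and add a non-root user.")
```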

And as we get deeper into using it and we build these into our workflows,

I think those are the areas we gotta focus on for accuracy.

Okay. And ultimately, like anything else in technology or really anything else that exists, everything is kind of iterative. Right? Yeah.

I think it's amazing where we are, and to suggest that it's no good because, you know, it's not magic yet is silly. It's a huge step forward, and I appreciate it for what it is and look forward to what's coming down the road. Which ultimately begs the question... well, it doesn't beg the question.

To me, this implies that you really don't need a PhD in data science or an advanced understanding of machine learning, or even a cursory understanding of data analysis, to be able to interact with the technology, now that we have this human language model that can sit in between us, a human being at a keyboard, and the actual artificial intelligence workflow that happens under the hood. Right?

Yeah. I agree with that.

I learned, when I did my transition from network engineer into software engineer,

that, you know, it's not one of those things where I was gonna become a software engineer overnight. And, you know, you gotta understand and appreciate that these data scientists and these engineers that build these models and build these systems, they got PhDs in this stuff. They've been working in these industries for twenty-plus years to do this. And for us to just jump in there last minute, you know, with a handful of years of experience, we're not getting the same quality as they are. And that has to be appreciated.

But as end users, that's what we are. Our focus is that we use the products they build. And so we have to learn how to leverage them in those ways, to your point. And it's become easier and easier. Every single year, it gets better and better and easier and easier for us to actually leverage these tools.

I think that was one of the bigger explosions here. Well, the explosion and the hype around ChatGPT and LLMs didn't necessarily come from OpenAI's awesome GPT models.

Okay, they were pretty standard, and they're pretty good. They hold the industry standard for performance now, but it wasn't necessarily that that model existed and it was much better. It was how it was delivered to the users.

It gave everybody a very simple interface to play with it. You didn't have to understand all the bells and whistles. And that is where I think the explosion in the industry happened, because of the simplicity there.

Okay. Yeah. Yeah. That makes sense, because, frankly, I am not that smart, and I can log in to my ChatGPT subscription and start doing productive things with it right away.

And I am no data scientist, that's for sure. But I do look forward to how we apply this technology, not just the large language models and the natural language... what does the P stand for?

NLP, natural language processing. Processing, thank you.

Not just that, but the underlying artificial intelligence workflow, the machine learning workflow that's adding that insight to the entire process, insight we may not have otherwise had, or that would be just insurmountable for us to attain on our own in a reasonable amount of time, considering the amount of data that we have to look at now and the complexity of networking today. So anyway, Ryan, this has been really excellent. I'm looking forward to seeing more of what you are personally working on and, of course, just tracking your career as well, because I know that's just really interesting. If I had it to do over again, you know, networking is great, and I love it as an industry and as a career, but I am so interested in this stuff.

To me, it's not just like the SDN fad. It's very real. And I can see, not just because it's being used in networking, but you can see how this technology is used across the world in a variety of industries and has real value. It's very attractive to me. I would definitely look at that as a career option if I was, you know, eighteen years old graduating high school today.

And I would add to that too, you know, it's career advice, so take it as it is. You know, at some point in time, we all have to kinda pick what we're gonna focus on, and we kinda pick what we're gonna do when we grow up.

And a lot of us, I think, kinda get stuck in the rut that that's where we gotta stay.

And I've never really liked that myself. You can see that I bounced around a lot. You know, I was given the advice a while back that we all get to kinda choose our own adventure.

And so I really take that to heart. And, you know, I would hope that everybody does that as well. There's a lot of movement that can happen in this industry, and you can follow those interests as much as you want. It's just a matter of how much effort you wanna put into it, and you just gotta be very open-minded about it and go for it.

Yeah. Yeah. I agree. We reserve the right to change our minds.

Yep.

And I add that caveat, especially when I talk to my own kids. I have one that's starting to look at colleges.

I say to her, you have the right to change your mind about majors and career options, but whatever you do, you give it maximum effort. Don't play around, you know, dive into it and give it your all. And if you wanna change your mind five years later, that's fine.

I've done it. And it's worked out. Okay. Anyway, Ryan, it's really been great to have you today.

I really appreciate talking to you. I'd love to talk to you again and get into the weeds even more about maybe specific ML models and how we apply them and why. So if anyone has a question or a comment, how can they find you online?

So kinda Twitter and LinkedIn are the best ways to go.

Twitter's kind of whatever it is right now, and maybe it'll be gone by the time this is published. Who knows? But I'm, at that one guy underscore fifteen.

Most anywhere, like Reddit, LinkedIn, Twitter, all those various places, that one guy fifteen is my handle.

But LinkedIn's probably the smartest way to get a hold of me.

Reach out. Anybody's welcome to reach out and chat. I love to talk about any of this.

My GitHub is out there, with the project that I talked about, building the web app with ChatGPT 4.

That's actually out on my GitHub page. I have the full context of the whole thing just kinda as an experiment, and I continue to play around with it and push stuff out there, so if you're interested, check that out as well, or contribute if you want. I don't care.

So, yeah, those are probably the best ways to get a hold of me.

Great. Thanks.

And you can find me on Twitter still, network underscore phil. You can search my name on LinkedIn. And if you have an idea for an episode or if you'd like to be a guest on the podcast, I'd love to hear from you. Just send me an email at telemetrynow at kentik dot com. So, until next time. Thanks for listening. Bye-bye.

About Telemetry Now

Do you dread forgetting to use the “add” command on a trunk port? Do you grit your teeth when the coffee maker isn't working, and everyone says, “It’s the network’s fault?” Do you like to blame DNS for everything because you know deep down, in the bottom of your heart, it probably is DNS? Well, you're in the right place! Telemetry Now is the podcast for you! Tune in and let the packets wash over you as host Phil Gervasi and his expert guests talk networking, network engineering and related careers, emerging technologies, and more.