Elon Musk AI Policy, AI Collaboration, and Liquid AI
E23


Intro - Outro:

Welcome to Wake Up With AI, the podcast where human-powered meets AI-assisted. Join your hosts, Chris Carolan, Nico Lafakis, and George B. Thomas as we dive deep into the world of artificial intelligence. From the latest AI news to cutting-edge tools and skill sets, we are here to help business owners, marketers, and everyday individuals unlock their full potential with the power of AI. Let's get started.

Chris Carolan:

Good morning. Happy Wednesday, November 6, 2024. It's time to wake up with AI here with George and Nico. How are you fellas doing this morning?

Chris Carolan:

I'm awake. Indeed. Indeed. Actually, I'm

George B. Thomas:

kind of in a good mood. I've been plugging away, getting work done, trying to keep up with the Slack. I think I start every session with, like, our Slack. I feel like people would maybe even just enjoy being in a Slack channel of information that they could pay attention to. We'd have to figure out how to curate that and what that would look like from a community standpoint.

George B. Thomas:

Hint. Hint. Think. Think. Future.

George B. Thomas:

Future. Okay. I'll shut up.

Chris Carolan:

Yeah. I'll pluck that one out of the long list of Claude suggestions about how we can deliver more value with this show.

George B. Thomas:

Yes, sir.

Chris Carolan:

What do we got today, Nico? Speaking of that long list that hopefully everybody will be able to see someday.

Nico Lafakis:

So much is going on, and somehow, where you wouldn't think I'd be able to stay on the topic of AI while dealing with some news, I'm here for you. Alright. Everybody knows the obvious, so I'll just get into what it means for the rest of us. What it means is that Elon Musk is now going to be in a very pivotal role within the government when it comes to AI.

Nico Lafakis:

Is this a good or bad thing? Could be good. Could be bad. I don't necessarily know what types of regulations he's got in mind. I do know that he's looking to try to slow things down.

Nico Lafakis:

Last time I checked, when he said he wanted to slow things down and sign petitions and got all these people to agree and say, yeah, it's a great idea, xAI popped up out of nowhere that week. So take his thoughts and speeches about AI regulation with the largest grains of salt that you can possibly find. I don't really think that it's going to be regulation that helps the community so much as attempts to help xAI get further ahead. And the only reason I say that is because I'm not looking at his AI policies. I'm looking at Musk's track record.

Nico Lafakis:

He did the exact same thing when it came to Tesla. A lot of government policy that was put in place specifically helped out Tesla, specifically helped them to be able to sell off carbon emission credits to other car companies to offset potential losses that they would have faced in their company. Right? So xAI is behind the ball, well behind the ball. Grok is really... I mean, like, who cares?

Nico Lafakis:

And especially on such a platform, who is signing up to a social media platform and paying for it in order to use an inferior model, for what reason? I don't know. So, yeah. I don't

George B. Thomas:

I sure do like them images over there, though. I'll throw that out there. It can generate some pretty dope images over there. By the way, I already have the account, so I paid for it, and I've used it. So there's this guy right here.

George B. Thomas:

Hi. That you just mentioned. Just throwing that out there.

Chris Carolan:

Yeah. You... And just for context...

Nico Lafakis:

Instead of GPT and Claude, though.

George B. Thomas:

I mean, I use those too. That's the thing. Like, I use multiple tools.

Nico Lafakis:

Right? Using Grok on a daily basis.

George B. Thomas:

Not on a daily.

Chris Carolan:

So for context, I'm assuming you're speaking about Trump winning the election. Well, that's the other thing. I didn't

Nico Lafakis:

wanna say that. Knows, and

George B. Thomas:

I was like, what are we talking about right now? And it took me a minute to actually understand that we're talking about a presidential election. I think

Chris Carolan:

what you're inferring, Nico, is that what we saw during the first term is he appoints people like Elon Musk, who's large and in charge in various ways in the private sector, to come in and manage public policy related to that specific private sector the person is profiting off of. So we don't need to get into too many of the details there, but I love bringing it up when it comes to AI regulation. Like, man. So the person that said we're gonna 10,000x, like, over the next 4 years is likely to be very integral in either making sure that does or does not happen, for various reasons, as he's in a position to create public policy around this topic. So definitely gonna be very interesting to see what happens there.

Chris Carolan:

And for me, it's like, okay. Get ready to go fast. And anybody who can run fast enough to keep up is probably gonna be able to leverage, you know, some opportunities.

Nico Lafakis:

To be honest, so, like, I do say these things about potential regulations coming into play. I think if it's smart regulation, it'll go the same way that Eric Schmidt was talking about, where you just choke computation. At least then you can't wantonly accelerate, because computation is just, like, a really easy way of accelerating, I suppose. But what it does do is spur innovation. Right? So whenever we can't go faster, we'll just figure out a way of making what we've got more efficient so that we can go faster with what we've already got.

Nico Lafakis:

Right? If we can't build a bigger engine, we'll make a more efficient one. That's already kind of happening with a lot of the models. We're already starting to see that some models that are lower in parameters are actually performing better than models that are much, much higher in parameters. Then there is the entire aspect of Liquid AI that hasn't quite caught fire yet, and that's an insane change to everything.

Nico Lafakis:

Let me see if I could pull up

George B. Thomas:

I know that was in our Slack. I don't know if we've ever talked about Liquid AI yet.

Nico Lafakis:

I don't think that we have. Let me see if I can find

George B. Thomas:

which... It was riveting, by the way. Like... It is. I watched the video, and I was like, okay.

Nico Lafakis:

So, like, what the big difference basically is, like, it can run all of the same functions as current AI systems, but it does it without needing nearly as many resources. Let me see if I can just... I wanna try to pull up this scale that they had. So this was from a talk that they just gave recently to sort of show off, like, okay, what is the difference with Liquid AI? Not necessarily just performance-wise, but also in terms of resources, because that's becoming the biggest challenge right now.

Nico Lafakis:

I think even California just slapped down Amazon, of all companies, in terms of whether or not they were going to be able to push the power output of a particular nuclear power plant in California to be able to scale, basically.

Chris Carolan:

I'll have you look for that. I'll hit on some of these because we don't always get to everything that you put in the chat, and there's some other good stories in terms of, like, just these headlines in general. You can read one like "AI-powered drowning prevention system enhances water safety." Like, there's so many applications like that that are gonna be overlooked where it's, like, literally making the world a better place, you know, in terms of, like, in that case, water safety. On the other side, like, how are private businesses approaching this?

Chris Carolan:

"Disney forms new business unit to explore use of AI and emerging technologies." Right? So that's where we're not gonna stop talking about this. If every business... like, I love what you said. I think it was on your podcast, Hub Heroes, George.

Chris Carolan:

You don't see... you don't think of them as small businesses. It's just a business that has an opportunity to grow. Right? That's where, like, if you would classify yourself as, oh, I'm too small to create an AI, you know, unit. Like, no.

Chris Carolan:

You're not. Like, even if it has to be just a unit, like, in your day as the leader of the org, create space to explore and, you know, understand the use of AI. And, like, instead of cherry-picking, like, oh, Disney is successful because of this, this, and this, so I'm gonna copy this but not this other thing. Like, this is part of the reason they're successful. Like, this is a clear signal that they're gonna prioritize this, like, significantly. Like, companies of this size don't just create business units because they feel like it.

Nico Lafakis:

Right. Exactly. It's a very big and bold statement to the rest of the industry, saying we're not backing away from this tech, right, as one of the leaders in the animation space, especially, and the entertainment space. If you're watching us, if you're on YouTube or you're on LinkedIn, I'm looking at a graph right now. It's basically plotting the output length, so the number of tokens in the output, against the inference memory footprint.

Nico Lafakis:

What is inference memory footprint? It's being measured in gigabytes. It's essentially how much memory it takes to get that length of output, so how much energy and cost it takes to get there. This is comparing Llama to Phi, which is Microsoft's model, to Zephyr, which is an open-source model, to Apple's AFM model, to Google's Gemini 2 platform.

Nico Lafakis:

How does it compare? When it comes to Llama and Microsoft's platforms, they're early renditions. They get to just over 100,000 tokens, but they use right at, like, the 80-gigabyte mark, and I'm sure a much higher amount of memory. If you follow any of, like, the hardware side of this stuff, you'll know that they're building clusters, gigantic clusters of memory cards. What does this look like?

Nico Lafakis:

If you were to take... and when I say 128 clusters, I mean 128 by 128. And the cluster itself could be 12 by... I don't know. Let's just say 10 by 10, so you're already talking about 100 by 128. Like, it's a ridiculous number. They were having huge issues.

Nico Lafakis:

And so running all those GPUs, right? If you remember the GPU craze during Bitcoin, when that took off, it takes a lot of power to get those GPUs running. One of the biggest costs for Bitcoin miners is electricity to run the computers that are mining their coin. I should say cryptocurrency miners; Bitcoin is just one of them. Anyway, in other words, these other models are taking quite the amount of energy.

Nico Lafakis:

Google's model is coming in as the most efficient with, like, the longest context window, at 1,000,000 tokens. I think it's even higher. The new Gemini 2 model is 2,000,000. The Gemini Pro model, anyway, is at a 2,000,000-token context window. And it's still getting close to the same, like, 80-gigabyte mark.

Nico Lafakis:

It's probably around 55 or so. The Liquid AI model is easily soaring to 1,000,000 tokens at 16 gigabytes. It's an exponential energy savings. It's not even like... you can't even just say, oh, okay, well, it's 44x less.

Nico Lafakis:

It's something with a remainder, and I'm not even sure what that comes out to in the next rendition. And why am I even bringing this up? Because this is how you get around regulations that are related to computation, or regulations that are related to hardware restrictions: you just build a better platform that's more efficient.
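To put rough numbers on the comparison Nico is describing, here is a quick back-of-the-envelope sketch in TypeScript. The token counts and memory figures are approximations quoted in the conversation, not official benchmarks, and the variable names are ours.

```typescript
// Back-of-the-envelope comparison using the approximate figures quoted above:
// early Llama / Phi renditions at ~100,000 tokens for ~80 GB of inference memory,
// versus the Liquid AI figure of ~1,000,000 tokens at ~16 GB.
const earlyModels = { tokens: 100_000, memoryGB: 80 };
const liquidAI = { tokens: 1_000_000, memoryGB: 16 };

const tokensPerGB = (m: { tokens: number; memoryGB: number }): number =>
  m.tokens / m.memoryGB;

console.log(tokensPerGB(earlyModels)); // 1,250 tokens per GB
console.log(tokensPerGB(liquidAI)); // 62,500 tokens per GB
console.log(tokensPerGB(liquidAI) / tokensPerGB(earlyModels)); // ~50x more output per GB
```

On those particular numbers, the Liquid AI model delivers roughly 50 times more output per gigabyte of inference memory, which is the kind of "it's not a simple multiplier" gap Nico is gesturing at.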

Chris Carolan:

Two questions. The GPUs and the memory clusters, is that why NVIDIA just overtook Apple to be the most valuable company in the world?

Nico Lafakis:

Yes. They are the top producer of GPUs globally. It's a little bit of a tizzy because of how this works out on a global scale in terms of hardware. Taiwan is the number one semiconductor producer in the world. Thankfully, they are not allied with anyone, but we protect them, basically, because China has been trying to take over Taiwan for quite some time now, because it would just complete their, I don't know, master plan of technology, I suppose.

Nico Lafakis:

So we do what we can. We have been restricting not just the export of stuff, but the quality, like, the grade of export. So in other words, if we're getting B100s to train on or something like that, you know, overseas they're getting, like, A100s or something. Like, they're getting lower-level training hardware than we do. So, you know, purposefully trying to keep them behind, I suppose.

Nico Lafakis:

At any rate, so Taiwan Semiconductor, the company, supplies NVIDIA, and NVIDIA then turns around and turns out insane hardware like the new Blackwell chipset, which I think is gonna start shipping in early January or February. Like, here's something insane to think about; this type of stuff is really crazy to me. Google missteps with the initial Gemini release and loses something like 700, or like 400... I have to go back and look at the number. It's ridiculous.

Nico Lafakis:

It's like six... $100,000,000,000 in market cap. It was ridiculous. Right? In one day, just because of a slip-up. NVIDIA announces they have to go back to square one with Blackwell because of a flaw that they didn't foresee.

Nico Lafakis:

Nothing changed. The valuation stayed basically exact. I think it dropped maybe, like, a percentage point. Like, nothing. Like, it was a drop in the bucket.

Nico Lafakis:

Why? Because Jensen said outright, we're using AI to design these things. So the bounce back was gonna take no time at all. The redesign was gonna take no time at all. And now they're back on schedule for shipping.

Nico Lafakis:

So, you know, initially they said, yeah, this might push us back or whatever. Nope. Right back on track. It's like nothing ever happened. Intel is planning on building a chip factory.

Nico Lafakis:

IBM is working on building a chip factory. There's actually a couple different data centers and chip factories being built right here in Ohio, very close to me. But this stuff is the future because it's being built now. It's gonna be, like, 3 years minimum until we see, like, some huge effects from US-based semiconductor factories. But when we do see them, that's where the progress just starts really, really going through the roof. And most people... if you've never seen a semiconductor factory,

Chris Carolan:

it's nuts. God. Yeah.

Nico Lafakis:

It's nuts like yours.

Chris Carolan:

I worked at Rigaku. We sold instruments that do materials analysis on the semiconductor chips as they're being made, and that's why it's all automated, and there's a quality control element. And back in 2020, and it's kinda related, you started hearing about all these shortages related to electronics, including cars, because they could not make these things fast enough. And, like, billions, probably trillions, of dollars from both the US and Europe, all kinds of funding, being thrown at semiconductors because there were no plants over here.

Chris Carolan:

It was just, you know, overseas. That's when they were deciding to, like... they were planning out sites, right, to build, and, like, breaking ground. And over the next few years, that's been a huge part of the onshoring. But, yeah, data centers plus semiconductor production capability, right, is gonna play a huge role in it. But as you're mentioning, that's a way to get around, like, computation restrictions.

Chris Carolan:

And I think that's... without going too much farther down a rabbit hole, maybe George can help us get out of it. This is also how you create pay-to-play, I would imagine, as well. Like, Liquid might exist, but that's where, you know, instead of $20 a month to use that model, it becomes, like, $1,000 a month, and it's enterprise only.

George B. Thomas:

I mean, it's interesting because the revenue model in the future is definitely gonna have to change. I don't know how, but it's definitely gonna have to change. You know what doesn't have to change, though? AI and human collaboration. And that's what I wanna talk about today, because we've been dancing around it in every one of these AI skills that pay the bills: how things work together.

George B. Thomas:

And even today, like, in the stories that we're telling, it's like AI working together to create these things that make humans' lives better, save humans, build things faster. And so, listen, AI is powerful. No doubt. But again, it's not here to replace us. Everything we talk about on the show, I just see ways that it's like we're being enhanced.

George B. Thomas:

We're being augmented. We're being able to take ourselves to the next level. So it is here to help enhance what we're doing in so many different ways. When we learn how to work alongside AI, and I am using that word on purpose, when we learn how to work alongside AI as a collaborative partner, we get the best of both worlds. We get things like Chris and I talked about yesterday, which is the speed and accuracy of the machines, and we also then get the intuition and expertise that only we humans can bring to the table.

George B. Thomas:

This is what happens when we focus on the word collaboration with these things. Imagine, just for an example today, we haven't talked about customer support in a while. You're in customer support. You're analyzing hundreds of inquiries daily. AI can quickly categorize these tickets, flag common issues, suggest responses.

George B. Thomas:

Right? It makes the complex a little bit more, like, simple. And it also gives us the time to pay attention to the sensitive cases that come up. Listen, you just heard Nico talk about the chipsets and the valuation, but we're gonna get back on track: collaboration. Right?

George B. Thomas:

Collaboration in so many different ways. Knowing when to rely on AI for routine tasks and when to step in as a human to add that human touch. That's what makes this collaboration effective. So there's a couple key things that I'm gonna hit on here because, again, we've talked about it in almost every one of these episodes. But if I hit the highlights on this, I would hope that people that are paying attention to the show, paying attention to us, would just fundamentally understand: AI is great for processing data.

George B. Thomas:

Human insight is key for complex, nuanced decision making. Use AI for routine tasks, allowing you to focus on the creative and strategic elements that need your human expertise. And think of AI as a collaborative partner, enhancing your capabilities, augmenting them, growing them, making you and your work more efficient. So collaboration between AI and humans, that's today's skill that pays the bills.
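As a concrete illustration of the customer-support example George walks through, here is a minimal sketch of AI-assisted ticket triage using the Anthropic TypeScript SDK. The model name, the categories, and the sample ticket are placeholder assumptions, not anything from the show; the key design point is that the model only drafts a category and a suggested reply, and a human reviews it before anything reaches the customer.

```typescript
import Anthropic from '@anthropic-ai/sdk';

// The client reads ANTHROPIC_API_KEY from the environment.
const client = new Anthropic();

// Hypothetical ticket text; in practice this would come from your helpdesk queue.
const ticket =
  'My invoice from last month shows a duplicate charge and I cannot reach billing.';

async function triageTicket(text: string): Promise<void> {
  const response = await client.messages.create({
    model: 'claude-3-5-sonnet-latest', // placeholder model alias
    max_tokens: 300,
    system:
      'You are a support triage assistant. Reply with JSON only: ' +
      '{"category": "billing|bug|how-to|other", "urgent": true or false, "suggested_reply": "..."}',
    messages: [{ role: 'user', content: text }],
  });

  const block = response.content[0];
  const draft = block.type === 'text' ? block.text : '';
  // A human reviews the category and suggested reply before anything is sent.
  console.log(draft);
}

triageTicket(ticket).catch(console.error);
```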

Chris Carolan:

Let it in, folks. And then stay in. Stay in with it. Don't leave. Right?

Chris Carolan:

Like Oh. Don't just be like, okay. You can manage this whole house now. Come back next month. See how you're doing.

Chris Carolan:

No. Don't do that.

George B. Thomas:

Well, that's not collaboration.

Nico Lafakis:

Not yet.

George B. Thomas:

That's passing the buck. Right?

Chris Carolan:

I mean, delegation, abdication come to mind.

Nico Lafakis:

Well, you know what? And to that effect, like, I was about to even step into meaning, and I forgot for a split second about a story that I posted to you guys this morning about Coca-Cola, like, reexamining its ads. And on top of which, content creators reexamining the use of long-form content. Because what happens with short form versus long form? What's the biggest difference between the two?

Nico Lafakis:

What's the difference between a meme and a painting? Meaning. A meme means something. It represents something. A short form video represents something.

Nico Lafakis:

It is a small portion of something. It is a quick joke. It is a quick advertisement. It is a quick promo. But long form has meaning.

Nico Lafakis:

A painting has meaning. I have to go over the entire canvas. I have to scrutinize every piece of the painting, see how it makes me feel, the emotional feedback that I get from it. A meme is just one and done. I see the picture.

Nico Lafakis:

I know what it is. It's a joke. It means this one thing, and I'm done with it. Right? It has nothing.

Nico Lafakis:

It doesn't hold my attention. So it's gonna be the same. I'm starting to see the same thing. Disney's talking about it. Coke's talking about it.

Nico Lafakis:

These are 2 huge advertisers, 2 huge content creators, and they're starting... again, they're talking about long form. And, you know, me personally, the guys know, it's rare you see me post a short-form video. I try. I really do try. I think, like, some of the shortest content I take in is around 10 minutes or so.

Nico Lafakis:

You know, even when it comes to stories that I find, a lot of people ask me... I did this on a whim. So if you guys wanna learn how to do this, we might put something together soon enough to show some people tips and tricks like this. But someone was looking for... it's a great little example. Somebody was looking, I think it was on Upwork or something, and was willing to pay $2 to a person that could take Google Alerts, make them get published to Google Sheets, and pull in, like, stuff from the alert and then format the content within the cell of the sheet so that they could just copy-paste.

Nico Lafakis:

Now, I don't care what the end result was. You know, it doesn't matter to me. $2. I was done in 35 minutes.

Chris Carolan:

Yeah.

Nico Lafakis:

I don't know jack about programming. I wrote a Google script inside of 35 minutes to connect Google Alerts to your Google spreadsheet. So how does that help me? I didn't get the money, by the way. Actually, a friend of mine sent me the thing, and I sent him back the code and the example, literally.

Nico Lafakis:

And I told him, I'm like, here, you know, take the... you know, get the money if you can, because my point is twofold. One, if you know me, you know I hate, hate, hate taking money from people to learn how to do stuff. I think education should be free. I think it's really terrible if you're charging people for something that they should know. Two, I don't like stealing money.

Nico Lafakis:

This person probably thought that it would take days to get this thing done. Maybe a week, 2 weeks. Who knows? No. 35 minutes.

Nico Lafakis:

And I can't, in good conscience, say, yeah, give me $2 for 25 minutes' worth of work. 35 minutes, whatever it was. It's ridiculous. I'd love to say I'm worth that.

Nico Lafakis:

I would love to, but Claude's worth that, because Claude solved the problem. Claude's worth the $2. Right? This guy, with that $2, could have solved who knows how many problems for the entire year. The plus side to it, though, is since I figured it out, now I have a rotating feed in my Google Sheet that just pulls in nonstop stories, pre-formatted for me.

Nico Lafakis:

So, yeah, I mean, like, that's just me. Right? Now I have, like, way fewer places to go to. I don't even have to go to Feedly. I get exactly the stories that I want, pulled in the way I want them, with the URLs, so I can immediately jump to the story and go pull the source if I need to.
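For anyone who wants to try something similar, here is a minimal sketch of the kind of script Nico is describing, written as TypeScript for a Google Apps Script project (Apps Script can be authored in TypeScript via clasp). It is not Nico's actual code: the feed URL is a placeholder for a Google Alert set to RSS delivery, and the "Alerts" tab name is an assumption.

```typescript
// Minimal sketch of a Google Apps Script function (TypeScript via clasp with
// @types/google-apps-script) that appends Google Alerts items to a spreadsheet tab.
// ALERT_FEED_URL and the 'Alerts' tab name are placeholders, not Nico's actual setup.
const ALERT_FEED_URL = 'https://www.google.com/alerts/feeds/XXXXXXXX/XXXXXXXX';

function pullAlertsIntoSheet(): void {
  // Google Alerts RSS delivery produces an Atom feed.
  const xml = UrlFetchApp.fetch(ALERT_FEED_URL).getContentText();
  const doc = XmlService.parse(xml);
  const atom = XmlService.getNamespace('http://www.w3.org/2005/Atom');
  const entries = doc.getRootElement().getChildren('entry', atom);

  const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Alerts');
  if (!sheet) return; // create a tab named 'Alerts' first

  for (const entry of entries) {
    const title = entry.getChild('title', atom).getText();
    const link = entry.getChild('link', atom).getAttribute('href').getValue();
    sheet.appendRow([new Date(), title, link]); // one pre-formatted row per alert item
  }
}
```

Attach a time-driven trigger to the function and the sheet keeps refilling itself on a schedule, which is roughly how you end up with the rotating, pre-formatted feed described above.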

Nico Lafakis:

That's the world we're headed into. That's the direction we're headed in. And if you haven't figured it out by now, I don't know what to tell you. I talk about it with Claude pretty often, that I feel like people have not quite caught up to where things are at, and that this stuff might just, you know, like, pass them up too fast, too soon. And especially with where we're going now with agentic stuff.

Nico Lafakis:

If you haven't seen what the regular models can do, what agents can do will blow your mind entirely because it's like asking a model to go do several things at once and just having it done in less than a minute.

George B. Thomas:

Yeah. They need to just wake up with AI.

Intro - Outro:

That's a wrap for this episode of Wake Up With AI. We hope that you feel a little more inspired, a little more informed, and a whole lot more excited about how AI can augment your life and business. Always remember that this journey is just the beginning and that we are right here with you every step of the way. If you loved today's episode, don't forget to subscribe, share, and leave a review. You can also connect with us on social media to stay updated with all things AI.

Intro - Outro:

Until next time. Stay curious, stay empowered, and wake up with AI.

Creators and Guests

Chris Carolan, Host
Chris Carolan is a seasoned expert in digital transformation and emerging technologies, with a passion for AI and its role in reshaping the future of business. His deep knowledge of AI tools and strategies helps businesses optimize their operations and embrace cutting-edge innovations. As a host of Wake Up With AI, Chris brings a practical, no-nonsense approach to understanding how AI can drive success in sales, marketing, and beyond, helping listeners navigate the AI revolution with confidence.
Nico Lafakis, Host
Nico Lafakis is a forward-thinking AI enthusiast with a strong foundation in business transformation and strategy. With experience driving innovation at the intersection of technology and business, Nico brings a wealth of knowledge about leveraging AI to enhance decision-making and operational efficiency. His passion for AI as a force multiplier makes him an essential voice on Wake Up With AI, where he shares insights on how AI is reshaping industries and empowering individuals to work smarter, not harder.