
Tricky Bits with Rob and PJ
Rob Wyatt and PJ McNerney discuss the latest and greatest news in the tech world, figuring out where things have been, where they are, and hopefully where they are going.
Apple Intelligence - Where is the Promised Magic?
Enjoying the show? Hating the show? Want to let us know either way? Text us!
25 years ago, Avery Brooks of Star Trek: Deep Space Nine fame was in a commercial for IBM...where, after decades of promises, he asked the question that was on everyone's mind: "It's the year 2000...I was promised flying cars. Where are my flying cars?"
In a similar vein, this latest episode explores the current state and historical context of Apple's AI venture, Apple Intelligence. The Steve Jobs era ended announcements with 'available today'; yet the present-day promises of AI features have yet to materialize.
Is Apple able to deliver on its AI promises? Is anyone? Either way, what are the challenges, pitfalls, and utility of any approach?
Come join the fun with the latest episode of Tricky Bits
PJ:Welcome back to Tricky Bits with Rob and PJ. Uh, folks, we know it's been a little while again, but we wanted to have a bit of fun today. You know, I remember a few years back, Steve Jobs would get up in front of one conference or another, whether it be WWDC or a bespoke announcement, and he would describe in great detail a new revolution they had created at Apple, whether it was the iPod, the iMac, the iPhone, and he would always end with the words, and it's available today. Fast forward to last year, mid 2024, and we've got Apple Intelligence, which is one of the entries in the latest zeitgeist for the race for AI, where it's going to read your email, it's gonna tell you when your mom's coming into town, it's gonna set up things in your calendar. It is going to be that magical personal assistant that you never knew you needed, but you do. And Rob, where is it at this point in time? We've been waiting for Apple Intelligence. It's, it's on the way. They sold the iPhone 16 on it.
Rob:That's a good question. I don't think you're gonna get it in the iPhone 16 timeframe. So all those people who bought it, all those ads on Apple's website that have "compatible with Apple Intelligence" and all that, it's going away, because as of yesterday, Apple are now pulling all the Apple Intelligence marketing material, because somebody recommended they do. The Better Business Bureau was like, yeah, if it's not available, you shouldn't say it is. And that is the ultimate question of, where is it? And Apple have not done this before. Yes, they've said, like the Vision Pro, it'll be available in six months, or this model of this phone or pad or Mac or whatever will be available on this date. And it might be a week out, it might be a month out. They've never said it'll be available and sold a product on it, and it's still not available. It's very not Apple to do this, but I think they realize how shit Apple Intelligence really is, and they've marketed the hell out of it. I mean, they even pulled the YouTube ad. They had all those ads where, it's kind of creepy, but in the ad a girl shows a photo of like a professor or something and says, who is this and when did I meet them? And Apple Intelligence answers, like, it was this person and it was this time when you, uh, met them. It's kind of creepy 'cause it's full-on stalker, uh, vibes from there. But it's functionality which they wish existed, and it's never going to exist just due to all the privacy concerns. I mean, on one end, Apple's saying, oh, we're a very private company, we won't take your data. But then you can say, who's this? And it'll answer you. It's utter bullshit. From day one it was very much, AI could do this, and it won't be our AI, and it won't be an AI in the near future, but here's a world where this technology makes sense.
PJ:Apple made this tremendous announcement last year. It was very forward looking, and now, I mean, it's nearly a year later. We are nearly at the next WWDC, and the only signs of Apple Intelligence that we've had so far are some slight cosmetic changes to Siri, as far as I can tell, a connection to OpenAI, and now I can do Image Playground on my phone, as well as this new button I've just seen called Visual Intelligence that might be part of the latest beta. So we've got all of these AI-esque features that have come together, but we don't have the promise that was talked about, with private cloud, with a private model. You know, none of this stuff has actually appeared.
Rob:I'm not sure it's going to appear as they originally intended. It all fits into the forcing of AI down our throats, and they don't know where the market's going. I think Apple's being very reactive to the AI world, where usually they're very proactive, they go, this is what we're doing, this is what the product's gonna be. Where for AI, they've been very reactive in responding to what others in the market are doing. And when they made all these announcements, maybe that was the forward-looking thing, and now it's a year later, and a year in AI terms is a hell of a long time. Now, what they originally planned, some of it makes sense, some of it doesn't make sense. Um, I still can't get over the fact that it feels like they're just forcing it down our throats. This is Apple's moment where, same as Microsoft when they missed mobile, this is Apple missing AI, and they just don't understand it in the way that we want to use it, or not. I don't want to use it at all, to be honest.
PJ:In terms of the missing, it really has a presumption that there's something there. Now, I'll fully admit, uh, I've actually been enjoying using Cursor for various different projects that I've been pulling up really quickly, whether it's an app or a backend, or I've even been doing some embedded stuff with it, which has been kind of fun. But beyond that use case for me, and maybe doing a little bit of research stuff, Rob, what is it that Apple is missing out on right now that's in the greater marketplace? Like, where are they being left behind?
Rob:They don't really have anything that makes sense in their product range. They are throwing stuff at the wall to see what happens. They've got the fancy removing objects from photos and whatnot, but where would it make sense? Siri. It would make a lot of sense at the Siri level if she had context, but she doesn't, so that would be the obvious use case. If they're gonna keep this intelligent assistant, which I'm not sure Siri should even exist anymore, but if they are gonna keep it and they have this AI, why are they not putting them together? And they've just announced they're delaying the new Siri again. So they obviously are working on this, but where is it? No one's even seen a demo of it, which is very rare for Apple. Like, usually some reporter or TV person or whoever it may be, some person with a lot of exposure, gets to go to Cupertino and see the demo hands-on. And this comes back to what you originally said. They used to have live demos on stage where Steve Jobs would just do something and it would work, or sometimes not work. And Steve had an amazing ability to dig himself out of any hole and still come out like the hero. And I think that's why Steve did live demos, 'cause he didn't care what happened, because he knew in his own confidence that he could get out of the situation that the demo would put him in. And they had lots of demos that didn't work, and it always worked in the end, at least worked in their favor. I think when we got to Covid, we started doing these presentations via video, which could be very carefully edited and very carefully crafted, and we went away from the live demo. And I think now we're just in an extension of that, where it's utter bullshit that they just pull out, of like, look, we could do this, and we made a cool video. And none of it's real. And I say that because no one's seen it. We went from an actual live demo of available today, to a video that's utter bullshit, with all the ads that they had, also utter bullshit, without anybody even seeing it in person. And then Apple's so secretive, nobody inside the company's gonna talk about it. And we have had reports of, oh, this team got fired and replaced by this team. It seems very much like they don't know what they're doing either. They're good at making videos though.
PJ:Well, so is Hollywood. I mean, we've had the promise, since Minority Report, of this magical AR interface that has never come to be. So you can make a good video and concept-car this stuff out, but there is a real difference, as you're saying, to actually delivering on it.
Rob:Or show it to somebody. Like, nobody has seen even a hint that this could exist. Of, yes, we could have this AI, we could have Jarvis, and we could have the Minority Report displays and whatever. These things could exist. Yes, somebody thought of it for the movie, somebody could think of it in real life. But you'd think if they're gonna show a movie of, who's this person and when did I meet them, they'd at least have a demo of that, where it's kind of clunky and kind of broken and not ready for release, but the concept is there. Nobody's seen it. So does the concept even exist, or are they just literally full of shit?
PJ:It is a really dangerous place to be if they truly are full of shit. The progression that we've seen out of Apple has been, as I talked about at the very top, it's available today. A difference of the Steve Jobs era, where for the most part it was, it's available the day of announcement. Then it moved to, it will be available at a particular fixed date, for example, the release of a given operating system or an upgrade to iOS. Then finally we got to the point of what we're seeing now, of, this could be something that we can do at some point in the future, where it became late fall, early spring, a lot less refined around these dates. Do you think we are seeing an erosion of Apple's credibility in the marketplace to deliver on innovative features?
Rob:Yeah, they haven't innovated since a decade ago. What have they done that's realistically new? Very new, like true innovation. The phone was obviously fantastic. The iPod was fantastic. The original iMac when Steve Jobs came back, that was groundbreaking at the time. After that though, it's kind of been like, well, everyone else is doing this too, and well, some people are doing it better, so we should copy them. The actual innovation is gone. I mean, even the watch, yeah, it's a decent watch, but Fitbit already existed and the idea of a smartwatch already existed. Apple just did it slightly better because of their ecosystem than everybody else, and I think the watch played into the ecosystem, where I think with the Vision Pro the ecosystem kind of failed them, because it's not a good ecosystem for the AR world. But they had to integrate it, and they did the best they could, but it's still clunky. Um, that's their last product. And I think the market responded the same way, in respect to what I'm saying. It's like nobody really wanted it, and I worked on the damn thing, and I knew at the time nobody would want this. But yeah, to answer your question, innovation is not a thing in Apple anymore. It's like rinse and repeat is the cycle we're in. Like, what's new in iOS for the last decade?
PJ:I, I mean maybe they'll add a fifth camera.
Rob:Exactly. It's basically a camera. Your phone is now a camera, and all the AI processing that it does that's of any use is related to editing your photos.
PJ:Are we seeing a pattern then? And maybe there's a broader marketplace question here we don't have to get into, but where, running after AR, now running after AI, are we desperate to get a new paradigm-shifting innovation? Do you think it's actually market forces, where it's like, hey, we're addicted to fast change, big change, innovation? Look, we've seen the internet. Hell, even wifi was an incredible innovation at the time. We've gone to mobile phones. Like, it's almost as if we need something new just to keep the whole thing driving forward.
Rob:Oh, for sure, 'cause markets are, where are they growing? Can they keep selling the number of iPhones that they're selling? They can sell 'em to existing customers, but are they actually getting new customers? Are they taking a dent out of Android? 'Cause it's the only other place to take 'em from. Yeah, they're probably growing into third world countries slowly, and so is Android. But are they actually getting a bigger market in actual number terms? And if not, then who are their new customers? Like, this is all capitalism. Yeah, infinite growth forever. At some point it comes to an end, and I think for people who have picked the camps, most people who have an Android phone don't want an iPhone, and most people who have an iPhone don't want an Android. There's a handful who switch back and forth or work on both or things like that. But generally people have picked the camp and they're staying there. So I think Apple have to go after these new markets because, A, it's expected by the shareholders, and B, they do fear getting left behind. Inside they have this mentality, we've talked about this a thousand times, of like, oh, we're the smartest people in the room. But they're also incredibly insecure about that statement that they like to make, 'cause even they know they're full of shit and they're not actually the smartest people in the room. The smartest person is in that company over there doing something cool. And I mean, Apple's never been first. They've always kind of been second and done it better, but even that's going away. With AI they're not even second, they're 10th in line. It's like everybody else seems to be doing it a lot better. And I think Apple's ecosystem plays into this. Like, we couldn't do a Copilot thing for OS X. Anybody could have written Copilot for Windows; Microsoft happened to do it themselves. But anybody could have written that, uh, software, because the APIs that Microsoft use are available to everybody else. They may not be well documented, but they are available, and if you call 'em and you're not supposed to, all the documentation says is, not recommended. They don't say you cannot call it. Whereas it would be impossible for me and you to write a system-level AI component for OS X. It would be utterly impossible for iOS. This walled garden is stopping me and you, or people over there, more importantly, who are very good at doing AI, from integrating it with Apple. You could do an app, but you couldn't do system-level integration, and there's a big difference there. Only Apple can do the system-level integration, and ultimately that will bite them, because they are not the smartest people in the room and they are very slowly figuring this out. And I'm not sure they know how to fix the problem, 'cause they're not gonna open it up.
PJ:How much do you think this is a rock and a hard place for them? And let's frame it in this way: Apple has touted for a very long time, we want to be the most private, we wanna be the most secure of devices. We want to specifically lock things down to keep it safe for you. This was a decision that was made very early on. They've stuck with it in many cases; even with, like, macOS, they've gone even further into it by sandboxing apps and making sure that they can't talk to each other. Now.
Rob:They've done it the crude way though, haven't they?
PJ:It's been a jewel for them. But in this case, are they stuck? Do they have to break old promises in order to create this innovative environment you're talking about, Rob?
Rob:It's all or nothing at the moment, yeah. Initially it was a good thing, because my mom doesn't know anything about settings and how to configure things, and it just works, and that's good until it doesn't just work. And now they've dug this huge hole where everything is in the hole and no one can get out of it. If I choose to use another app store, then why can't I make that decision? I know this app store's dodgy and I'm willing to do it. Maybe it's a burner phone. Maybe I have to do it because my work tells me I have to do it, whatever it may be. This mentality already exists. Lockheed Martin have their own app store. Apple internally have their own app store. So app stores exist with apps that are not signed by Apple.
PJ:Mm-hmm.
Rob:Apple give them an enterprise master certificate, they get to sign their own apps, and you can download them from the internal store, no questions asked about what those apps do. Um, obviously for Lockheed Martin it makes sense, because you can't be sending this app to Apple, 'cause it's probably secret or has top secret clearance or things like that.
PJ:yeah, yeah.
Rob:So those apps can't be sent out. So the mentality already exists that this needs to happen. It's just, why can't it happen on a bigger scale? Like, if I want to go to the app store where I can get, uh, dodgy software, why can't that be my choice? If I'm giving up my security or my privacy, then that's my choice.
PJ:So I framed it a second ago as being a philosophical problem, uh, with privacy and security. Uh, now I'm gonna be a bastard and frame it as a financial problem, because to your point earlier, Apple is slowing down in terms of its growth, in terms of how many new devices it can sell. As of its last earnings report, what was its greatest area of growth?
Rob:Services.
PJ:Yeah, so it actually creates a disincentive for Apple to want to open stuff up to other people's services, rather than directing users to their own homegrown services that they can create and sell.
Rob:That's true, but they're not the best at doing these services. When we go back to the Apple Intelligence mindset, they are not the best at doing this. So they have to open it up to others, and they are doing that with OpenAI and things like that, but they kind of do it in a very crude way of, like, oh yeah, we could stay private if we do this, but we can't do this and stay private, 'cause we have to send your data to OpenAI. They did it the way a high school kid would do it. Oh yeah, we'll do this, and then hard switch, this goes to OpenAI. If they were willing to put resources or engineers on this, they could have done a much, much better thing. Again, I think yes, the services are big and they want to grow these services, but are they just gonna become a services company? If they're just gonna be a services company, why don't they just do all these services for Android? Because there's a massive market that they're not even tapping yet. I think internally they're conflicted. I mean, yes, you can get Apple Music and Apple TV for Android and, uh, things like that, so they know that market's there. I mean, they're not morons. Well, they are, but they're not. But why is iMessage not available for Android? 'Cause it's the golden goose. And is RCS gonna change that? Maybe, maybe not. I dunno. It's interesting. But the Apple Intelligence thing is a problem that doesn't fit in this space. It's not a good thing to have in a walled garden, where they get to pick who you farm your data out to. Yeah, their business deals are all like this. Why don't they just make it so there's a service interface where others can plug into it, and then anybody could provide the backend?
PJ:Well, I think it's this confluence of these constraints they've put on themselves. They wanted to be the most private. I like that. They wanted to be the most secure. I like that too.
Rob:I like it too, until I don't. Like, how do you turn it off? Like, right now, I don't want the security. I need to do something that I know is dodgy. I decided to do it. I want to be able to turn it off. They're very black and white. If you have access to this, then the entire system falls apart. It's not very secure in a modular sense. There's a wall around the outside of the whole thing, and if you get inside the wall, all bets are off. So it's actually bad engineering in the grand scheme of things. If you look at what other operating systems do, it's like, yeah, this is still secure. We can still have security and privacy even if you have full access to this part.
PJ:Let me poke on this for a sec, Rob, because Apple for the longest time really has been a consumer electronics company. It's a consumer hardware company, and I'm hammering on the word consumer for a second here, in contrast to someone like Microsoft, which obviously sells consumer software but is vastly more friendly to developers, either in its operating system or the tools it provides. Like, everything that Microsoft provides is really first rate for developers. Apple seems like it has a very tense relationship with its developers. It needs them to provide apps, but also sort of hates them at the same time.
Rob:Oh yeah, Apple are the absolute worst. I mean, even Xcode, it doesn't even give errors in the same way as other development tools. Like, just tell me what module you're compiling and tell me the error in that module. I don't need you to take them all and put them in a window for me in some order that you decide. It makes working on Apple very different. And even the languages they use, of like, you will use Swift. It's like, I don't wanna use Swift, my engine's in C. Everything they do is not normal.
PJ:Well, it's under that aspect of control, right?
Rob:Yeah, if you wanna work on our platform, you will do this. You will use Swift if you want to use the modern UI. And if your engine, a game engine, for example, or whatever it is, is written in C, let's say the game engine, 'cause it's a real hard one and it's caused a lot of problems for me personally. Okay, so I've got this engine and it's written in C, and it's got a backend for Vulkan and it's got a backend for DirectX, which are both C/C++, and now I need to do a Mac version. So now I have to deal with Objective-C, 'cause I'm not gonna deal with interfacing into Swift. And so now I have to do an Objective-C backend just so I can call the APIs, because Metal has no C bindings at all. That's just a horrible thing to have to deal with. Lots of bugs, lots of marshalling of data. Just awful. And they could just provide a C interface for this one use case. They could say, we recommend you don't use it, but it's here if you need it. Which Microsoft do all the time.
PJ:Again, does this derive from a philosophical issue that the two companies differ on? On one hand, Microsoft is very game friendly; it's been in their DNA for the last several decades. Apple is not.
Rob:Games, I think, are gonna go a similar way to the AI. On one hand, Apple is the biggest gaming platform, um, because they have the phone and they have the Apple TV and things like that. But no one thinks of them that way. When you look at what Microsoft and Sony do for actual game developers, it's an order of magnitude different and ten orders of magnitude better. Apple were in the perfect position to be the winner in the game space. They could just make a slightly more powerful Apple TV and it would be as powerful as a PlayStation, but they're not willing to document the hardware. They're not willing to help you make things more performant. It's all, we'll write you a tool to do this, we'll make a better compiler, we'll make new hardware, blah, blah, blah. It's the very software-engineering approach, which games still haven't caught up with. Games are like, this is the hardware we have, we're gonna make it go as quick as we can. They have the mindset that we need to do this 'cause this is how modern GPUs work, but the ecosystem that goes with it just isn't there. Actually, talking of the GPU APIs brings us back to the 10,000 pound elephant in the room.
PJ:CUDA.
Rob:That's CUDA, yeah. Like, all AI outside of Apple is pretty much done on CUDA at some point, especially for the training side. The inference side, maybe not so much, but the training side is all CUDA, which is why Nvidia has such a huge position in the AI world.
PJ:Yeah, most of the inference, to be fair, just as a slight inflection, can happen locally or on other devices besides Nvidia. But we can also confidently say that since most inference is also in the cloud, inference is happening on Nvidia.
Rob:Yeah, so how does that affect Apple, going back to the Apple Intelligence of, like, are they doing it with CUDA and just using the cloud, or are they trying to do it in-house? Which also puts 'em at a disadvantage, because now they have all this open source R&D code that they could use, but they can't, because it's all written in CUDA. And having been someone who's ported code out of CUDA, it's incredibly difficult. CUDA has a lot of built-in things that just make things work. Maybe not the most efficient way, but it does just work, and getting that outta there and then having to kind of rearchitect the same sort of scaffolding that it needs becomes very difficult. It's not always an easy one-to-one mapping, even if you're Apple and only you have access to the internal APIs, which ironically are C. It's a problem I don't think they know how to fix, because they've burnt the bridge with Nvidia, so they can't go back, and they can't make a version of CUDA. So what are they doing? They're just doing it in Metal code and kind of ad-hoc-ing it together. Not the best way. And I think that's part of why they're so far behind: they can't use the tools that everybody else uses. They have to do it in-house, and they're not doing it in the way, they're doing it the Apple way, which isn't the way that this market went.
PJ:It's another useful data point to talk about from the philosophy side. 'Cause like, on one hand,
Rob:I.
PJ:Apple has made a huge investment into its own silicon. It's designed its own neural engines. It has the unified memory. It has touted these things in terms of being so powerful. Now they have a choice, you know, for these clouds they want to build. Do they go this route, which it sounds like they are, of doing it with Metal, trying to replicate all this stuff, really reinventing the wheel for the Apple hardware? Or, how bad would it be as a press release, if it ever got leaked or got out, that they were using Nvidia hardware under the hood to take advantage of everything that's out there?
Rob:They should stuff that press release is what they should do, and just do it. I mean, yeah, Apple silicon is powerful with respect to its power consumption. Is it a 5090?
PJ:no.
Rob:No, not even close. And no matter how you fiddle the specs or the stats or your axis-less graphs in your presentations, it's not. I mean, when the M4 came out, they compared it to a 4090 in very specific cases, and maybe in a few ways it is slightly faster, but, uh, generally, in a general app using general GPU techniques, the M4 hasn't got a chance in hell against the 4090. Um, and that's a 4090. We're not even talking about the H series
PJ:Oh
Rob:cards.
PJ:For, yeah. Yeah.
Rob:Nvidia are obviously not stupid. They have a lot of this stuff figured out, and it obviously works, because everybody uses it. So Apple, if they want to be competitive in the AI space where everything is done in CUDA, they also need to kind of be in that CUDA space, otherwise they're reinventing the wheel. They have to build the platform first before they can even do any good AI. And they have to build the tools, they have to build the debuggers, they have to build everything, and they have to kind of re-implement everything that's already out there. They could just take it and, uh, use it, and they do that. I mean, you look at, uh, PyTorch, there's a Metal implementation of PyTorch and blah, blah, blah, which makes other people's AI usable on Apple. But at the big scale, no one's using PyTorch; at the big scale it's all done natively.
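[Editor's note: for listeners curious what that Metal implementation of PyTorch looks like in practice, here's a minimal sketch. It assumes nothing beyond stock PyTorch with its MPS (Metal Performance Shaders) backend; the same script falls back to CUDA or the CPU depending on the machine it runs on.]

```python
# Minimal sketch: picking whichever accelerator PyTorch actually has available.
# On an Apple Silicon Mac this lands on the Metal-backed "mps" device;
# on an Nvidia box it lands on "cuda"; otherwise it falls back to the CPU.
import torch

def pick_device() -> torch.device:
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")   # PyTorch's Metal Performance Shaders backend
    return torch.device("cpu")

device = pick_device()
x = torch.randn(1024, 1024, device=device)
w = torch.randn(1024, 1024, device=device)
y = x @ w  # the same matmul runs on Metal, CUDA, or CPU, unchanged
print(device, y.shape)
```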
PJ:So you've been inside of Apple. You've come to what I think is the correct engineering conclusion, which is: screw trying to do all this stuff for training on Apple silicon. It's fine for inference. We need to just step up, buy the Nvidia hardware, start using that, at least for our backends. How does that go over with the executives? In terms of telling that to an exec who's invested billions of dollars in Apple silicon, as well as all the marketing credibility?
Rob:it doesn't. That's why it hasn't happened.
PJ:Okay.
Rob:Yeah, you'd never sell that inside Apple. Not at all. It would be a huge, huge PR and executive problem. And don't forget, the executives are fairly disconnected from the actual bits that happen inside the processor. They work on the same specs that they bullshit to everybody else.
PJ:This is an interesting contrast though, because, you know, I don't often put Google into a favorable light, but they develop their own TPUs for doing AI, uh, operations, and they also have zero problem saying, yeah, we put in a massive purchase order to Nvidia to pick up a whole bunch of these cards.
Rob:That's the Google approach. It's like, Nvidia do this better, we'll just use Nvidia. We are not in the market for remaking all of this. We're not in the market for reinventing the wheel. We'll just use them. Yes, it's a competitor, but does it matter? They have the TPUs for inference. They can train on the Nvidia side, they can infer on the TPU side, and they can learn how to make better TPUs based on what Nvidia have already done. Apple just don't do that. It's just not in their DNA. It's like, we'll do it in-house, or we'll buy something, but there's nothing to buy, 'cause everybody's already owned. All the big AI companies are all related to other $3 trillion companies.
PJ:Yeah, they really all have been snatched up. And again, I mean, to
Rob:I mean, they do hire, they do poach back and forth and things like that, blah, blah, blah. But one person can't make this happen.
PJ:Let's play out the scenario of buying an AI startup. So Rob, you're running an AI startup right now. What hardware are you using to train your models and do inference on them?
Rob:Yeah, there's not even another answer, is there? It's Nvidia.
PJ:Right. So if I bought you, it's equivalent to me saying I need to go get Nvidia hardware at this particular point in time.
Rob:Yeah. And it'd be easier just for them to do it, but I mean, they have the people. I think they're struggling because of other restrictions, the walled garden and the hardware and things like that. They open up a lot of research papers and they do document a lot of what they're doing, so they have smart people. Why is it not playing out? It's something else. It's not just that they may not be the smartest people in the room; something else is crippling them.
PJ:Let's look at like, you know, kind of this idea of we need to do it all ourselves
Rob:Well, before we go there, let's talk about why it's crippling. First of all, is it even needed? Does anybody want this product? That's the first thing; I think that's a crippling, uh, question. And secondly, how AI up to now has been: it'll hallucinate, it'll tell you wrong information. It's only as good as how you interpret it.
PJ:Yeah.
Rob:You can't rely on it being a hundred percent correct. And you can see that with Google's, uh, AI Overview. When you do a Google search, it's mostly right, but sometimes it's horribly wrong, and it's only when you know the subject that you go, oh, that's actually very wrong. How often is it wrong on things you don't understand so well? Is that a place where Apple want to be? It's kind of a gray world of like, eh, it's full of shit, or it's hallucinated, it just made something up 'cause it doesn't know the answer at all. Um, you see this in code where you use, like, Copilot or Cursor. If you're doing JavaScript, like how do I draw a circle in an HTML page, it'll get it right every time. But if you start going into the weeds of, like, how do I program Nvidia GPUs directly, register-level stuff, there's very little other than the Nouveau driver documented on the internet. So it doesn't have much information, but it will attempt to answer it, and generally gets it very wrong, and it will take you down this wrong path. Is that a place that Apple want to be? If they integrate this current level of AI smarts into Siri, she'd be worse than she already is. Go back to the original example. Daring Fireball had the same example in theirs too, and it was an Apple example to begin with: asking Siri, what time does my mom arrive at the airport? And it's a deep question when you get into it, because A, where's the information? Is it in an email? Is it in Messages? Is it FaceTime? Did someone just write it on a note? Wherever it is, it's somewhere in the Apple ecosystem of data, the information about my mum arriving. But where's she arriving at? Is she arriving at the airport or the train station? Who's my mum? How does Siri know who my mum is if I don't have a contact called Mum? If you say, oh, my mom's name is whatever, then she has more information. But that's context, and Siri doesn't currently have any context, and that context gets much bigger as these questions get more difficult. Let's say you ask Siri to book me a trip to San Diego. Well, now she has to know when you're going and what you're going for. Do you want to get there a day early? Maybe you're going for a conference and it starts at 9:00 AM, and you wouldn't wanna fly in before 9:00 AM; you'd want to go the day before. What sort of hotel do you like? Uh, all of these personal preferences that she'd have to ask you. And she could ask you a million questions and you could answer them, but she wouldn't remember them. So I think this back and forth context building with a smart assistant is only useful if it remembers it. If I have to tell it the same answers to the same questions every time, it would be a lot of work. Where if she was smart enough to go, oh, last time you stayed, you'd rather be downtown than by the beach, maybe that's where you want to be. Maybe you're going to a city that doesn't have a beach, so therefore it's not even a question you'd want to ask. If you booked me a ticket, you'd ask three or four questions and then you'd just do it. And if you booked me another ticket, you'd be like, okay, I know how this is gonna work out. How do you get that level of context? They can buy training data from OpenAI or Google or somebody, and they're not training it on your data. So they have a fully trained model, and it doesn't need to train on my data to read my email.
So Apple's AI backend can read my email and get context of what my email says without ever having been trained on any of my data. That's something that they could purchase. But how much of my data does it need to have, to be smart enough to have context for a simple question: what time does my mom arrive at the airport, or not even at the airport, what time does my mom get here?
PJ:There's that longevity question you're getting into, I mean, which is again the tension of the privacy of the data, right? I want a personal assistant that's personal to me. For that assistant to be personal to me, so I'm not answering the same questions, as you're saying, three or four times over and over and over again, uh, that data needs to be stored someplace. And ideally it's like, I can ask on my phone, I can ask on my computer, I can ask on my HomePod. So this data isn't stored locally on any of those things, unless you're syncing across the board all the time, which probably wouldn't work that well anyway. So there's this question then of, okay, then where is this private data? 'Cause it's a lot of personal information about where I live, you know, what my credit card number is, where I like to stay, what my location's going to be, that is now gonna be held by somebody. And I think Apple wants to bank on the idea that, hey, we've been trustworthy with your data so far. We're gonna hold this stuff in the cloud in some kind of secure, private location. That's the only way that you, like, get away with creating any sort of notion of a long context, is to be able to build up some corpus of data
Rob:Yep.
PJ:that ties itself to you. But by necessity, that's personal data.
Rob:That's highly personal data. That's not even like, yeah, you read a few of my emails. It's just like, where are you right now? Think if you had a personal assistant, someone who works for you daily, and she or he is your personal assistant. Think how much they know about you. I mean, go into the celebrity world of personal assistants; there's NDAs and things like this, 'cause they know everything about you. Do you really want that to be in an Apple database somewhere? Even though they've never had any big, big data breaches, doesn't mean they won't have.
PJ:Well, I mean, as the data value goes up, the,
Rob:The opportunity cost goes up too, yeah. Or, like, maybe no one's ever looked, 'cause it's like, who cares? I mean, the best way to not get hacked is to not be relevant, but now you are the most relevant target with the best information. And then how does law enforcement fit in? Are they going to open this up? Like, your actual personal assistant probably won't tell the cops where you were. Would Apple, if the cops show up with a, uh, subpoena? Is this truly personal, always encrypted? If it's encrypted, how are they reading it? There's all of these questions of, well, if you can pass it between my devices and somehow have some context to reply to, you obviously interpreted that data, whatever it is, whether it be encrypted in transport or not. It's usable by you at the back end. It can't all be local, 'cause it's too much power. It's a massive problem as to how you do it. Even if they don't use your data to train, like I said, they could buy that data; they can get the ability to parse context without any of your data. But as soon as they start applying that to you specifically, they have a little database with all your information in it. That's highly personal data. Where it's like, credit card number, I don't care about it, because I can just get a new credit card,
PJ:sure.
Rob:Like, a thousand websites probably have my credit card number, and it's probably publicly on the internet too. Uh, it's just a thing, you know? Yeah, sometimes some data has to be locked down, and sometimes it's better to approach data as, what would happen if somebody got this? I'm in the mindset now that a social security number is just public information, so let's assume it's public. Same with a credit card number, assume it's public, and there are other ways to protect it. But some of this data can never be in that state. So it has to be private, it has to be secured. That's why we have safes, that's why we have things; we put it away, keep it safe. And why would we give all this information to Apple? And if we don't, then Siri will never be useful, no matter how smart she gets at the backend, without my data and without context of historical data points on me: where did I stay last time I went to San Diego, why did I go to this conference, what are my travel preferences, my credit card number, what are my preferences across the board? And then extend that to kids and your wife and your business associates. Things that I could ask an assistant about, and with maybe one or two questions back, they'd figure the rest out, because they can go find my flight details and put you on it, blah, blah, blah. In this miraculous world that they put in videos, where this just works, the details of how that just works are kind of scary.
PJ:It gets to this conflicting set of constraints, and the privacy thing is a big constraint. And, you know, we're touching upon it: Facebook, Google, they're able to create statistical cohorts. Like, you and I might get served the same ads because we're round about the same age and, you know, demographics, blah, blah, blah. But to take it to the level where it's useful to Rob, where it's useful for PJ, there's a lot more personal data that's needed. Then on top of that, there's the question of, Apple, even though it's growing more in the services space, has not traditionally been a services company. So there is a question of how you build up that set of competencies there. And then on top of that there's, oh, we don't get to train on the same hardware as everybody else; we have to do it ourselves on Apple silicon. So now we've got all these constraints layering on top of each other. With all that in play, Rob, does it then make sense why, you know, this stuff is taking forever or may never appear?
Rob:I'm in the rather-it-never-appears camp, because unless it's actually a personal assistant, it's still gonna be clunky and useless. It's still gonna be like, I'm done with it, I'll just do it myself. The data problem is the crux of the problem. The technological backend and all that, they can work through that, or buy Nvidia and not tell us, or whatever they wanna do, I don't care. It's the data problem of, you can't be useful if you don't know anything about me.
PJ:Yeah. Yeah. I.
Rob:You just can't do it. So where does that data go? And would I be comfortable with Apple, with all of their privacy constraints that they've had over the years? I don't want that level of information about me to leave my filing cabinet.
PJ:Let me come back to this one, because I think this is actually a lot of fun, but let me detour for a second and then round back, 'cause I know it was a question I asked you yesterday. I'm gonna ask it again. So, if Google had, in Android, the ability to do this personal assistant stuff, would you switch from Apple to Android?
Rob:I know yesterday I said I would, but I thought about it more, and I probably don't want Google to have that information either. Uh, I'm probably less likely to want Google to have that info, but
PJ:a sec.
Rob:they could probably derive a lot of it from what they already have. So I don't really know, would I switch? I'd like to switch away from Apple, because they're starting to annoy me, and I think a killer app on Google might be enough to make me switch. And I don't use my Mac as much as I used to. My iPad is kind of, nah, don't use that. The whole integration between Apple devices, my watch, I hate. I don't need much to switch. I use Linux on a daily basis. I'm on a Mac right now, but I use Linux on a daily basis. I use a bit of Windows for the graphics stuff, and I don't have anything that's keeping me on Apple. So if there was a killer app that's like, I need that, then that would be enough. Would I switch for an ultra-smart assistant? Again, the data, what's going on with my data, is the problem.
PJ:So I get it that you don't trust Apple, but the reason I'm poking you on it has to do with, would you switch to Google knowing,
Rob:I trust Google even less than I trust Apple
PJ:Okay. So the killer assistant actually is not the killer app for you to switch off of, right?
Rob:Right now, I don't think the killer assistant can exist, because of this problem. If you are willing to give up everything they could possibly want to know about you, then that app could exist, but I am not willing to give that up. I'm not even willing to give a lot of this to, like, HR of a company that I would work for. There's a lot of information you wouldn't even share with your spouse and things like that. And then it also spreads: yeah, how about my kids? All of this stuff, over time this scope of data grows. And obviously, if you know someone really well, they've known this and they've built up that same bundle of knowledge about you over time, and unless you know that, you will never be useful,
PJ:So are we saying that an AI assistant could reasonably only exist in a country like China, where all of your personal information is gonna be known by some government entity and its related corporations?
Rob:But is there always gonna be context that isn't known by the government, no matter how much they know about you? Which seat do I want to sit in? Do you wanna sit in the window seat, the aisle seat, the middle seat?
PJ:but even
Rob:You could take stats off the market, like, oh, I was trained on all of this; he's probably not gonna want a middle seat, but is he window or aisle? And it's like, yeah, there's data out there, but does that apply to me? If I travel with somebody else, I'll happily take the middle seat, 'cause I'd rather sit with them than be in a different seat.
PJ:I mean, that's derivable data though. I could look at your history
Rob:Yeah, but you have to know. You have to know my history
PJ:Right. I'm,
Rob:and it switches back and forth again. Context is important, all this higher-level context, and it's context on top of context. Um, I don't think this app can ever exist. And I think they're barking up the wrong tree. They're doing a me-too of, like, oh, we need to be in the AI space, and we're just gonna put some bullshit out there of what we could do. And it's never going to happen. And if that doesn't happen, who cares about the rest of it? The only useful AI app that Apple can have is to make Siri smart. Outside of that, it's useless.
PJ:I'll recall back to something I said a year ago during our WWDC predictions episode. I said Apple should just stay away from this, because I don't think there's a place for it yet in the Apple ecosystem. They went whole hog into it and went into fantasy land. And I think where you're coming to, and I think I agree with you, is: either they need to accept that this app can't exist without an interminable amount of context that's required for it, or they have to make a really hard choice and open up the system to someone who can create the set of apps where this is useful. Beyond that, you are just hammered between the rock and the hard place.
Rob:Yeah, even if they open it up, I'm still not gonna wanna share that level of data with somebody else. Unless somebody shows up and they're like, we are ultra secure, we do it this way, maybe a new way of handling data which no one's thought of yet. But that would have to be replicated by Apple, because, no, they wouldn't be allowed to do it to the level required, due to the closed wall.
PJ:we
Rob:So there are options. I mean, maybe it's all local. Maybe you have a server in your house and it all stays there, and it's on your network, it never leaves your own premises or whatever it is. And that's how they distribute across all the devices and whatnot. And
PJ:how
Rob:could have this.
PJ:where I use Tailscale to create a VPN for myself. I'll use Tailscale for that, sit models on my PC, and then just hit them from either my phone or from
Rob:Yep. So,
PJ:then it's not going anywhere.
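[Editor's note: a rough sketch of the kind of setup PJ is describing here, a model served on your own PC and reached over a private Tailscale network so prompts never leave your tailnet. The hostname, port, model name, and endpoint below are assumptions in the style of a self-hosted Ollama server, not anything Apple or Tailscale ships.]

```python
# Sketch of the setup PJ describes: a model served on a home PC, reached over a
# private Tailscale network from a phone or laptop. The hostname, model name,
# and endpoint are assumed (Ollama-style local server).
import requests

TAILNET_HOST = "http://my-desktop.tailnet-example.ts.net:11434"  # hypothetical tailnet name

def ask_local_model(prompt: str) -> str:
    resp = requests.post(
        f"{TAILNET_HOST}/api/generate",
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    # The prompt and any personal context stay on the private network.
    print(ask_local_model("Summarize my travel preferences from these notes: ..."))
```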
Rob:So maybe that's what they do. Maybe they have a little Mac mini that just sits in your house, and it does all of the mining, and all of the context management is done locally. And that's a game changer if that's a thing they can do. But would they be able to sell it? I mean, as a technical solution it's viable. As a commercial solution? As a consumer solution? Mm. Maybe commercial, because enterprise will be like, yeah, we'll buy a new piece of hardware, it's fine. Consumer? How do you even convince them they want it?
PJ:No, it's an excellent point. I mean, imagine sitting it inside a Mac mini or an Apple TV. Like, beef up the Apple TV a little bit, and then all of a sudden, oh, next time I upgrade it, I could just put it there. Great. It just acts as my little repository of information.
Rob:Yeah, and it could get information from the whole house then, 'cause everybody else could use the same thing. I dunno, it's not an Apple thing to do, but it's a technically viable solution. Whether they could work it in, I don't know. But even with that, how does that get backed up? Where's that going? Is it stored in the cloud encrypted? Or can you put it on another device? Who knows? Would you ever be comfortable with any corporation having that much information? And how and where is it stored? They're never gonna be honest with you. They're never gonna say, it's stored right here, this is ours, 'cause if they tell you too much information, it gives you the ability to access other people's. The whole world of security is like, yeah, security through secrecy is bad, but there's still a lot of secrecy in the high-level things, of like, how do you even get to the data?
PJ:It is true. Security through obscurity might be bad, but it's still prevalent.
Rob:But again, who's comfortable with this? I still wouldn't be that comfortable with it. I'd be more comfortable with it being on a box in my house, but I'm not entirely sure. And it's just a longer-term problem. Okay, once it exists, the cat's out of the bag. Is it a thing that the government could get a hold on? Is it a thing that the police could use to prove or disprove something?
PJ:an important
Rob:I'm not sure whether I'd be comfortable with any of that. So I'm a firm believer that this app cannot exist, no matter how many videos they make about how cool it is. Think about that app, where she shows the picture of the professor and is like, who is this and when did I meet him? How does it know who he is? Did it just use all your face scan data, that they said they'd never use for anything else, to identify somebody else?
PJ:No Rob, it uses bought face scan data. It's totally, totally legit.
Rob:How is that a viable thing? Yeah, are they going out to one of these face ID websites and just giving them a scan of the face? I don't want 'em to use that website. I don't want 'em to have any data. I don't want them to have my data. I don't want them to have my photos. So each individual site that it's using, which we have no control over, is its own individual set of problems. But yeah, that's just a creeper app. Why can't you just go up to some girl on the street and be like, oh, who's that? Yeah, tell me what you know about her. Oh, she lives here, she does this. Or maybe she gives you enough information that you can mine it in your head and be like, okay, I know. And now you've just made a perfect stalker app.
PJ:This is like another one of those issues where we're at the edge where like a lot of personalization is awesome, but also awful at the
Rob:not everybody has good intentions.
PJ:has good intentions.
Rob:What's your take on it? I just gave my take. What's your take on this? Do you agree with what I said, or would you be comfortable giving your information up for flexibility or convenience?
PJ:I don't think there's a huge amount of cost utility for me in saying, go buy me a ticket, for the amount of trouble that it could get into. And I'm using that as kind of the base example. It really is almost not worth it to me. It is so easy to do online, where I'm, like, comparing prices. Like, even if it takes 10 minutes
Rob:I.
PJ:it's still worth it to me, 'cause I get exactly what I want, and I don't know if I wanna pay for some AI assistant to go do that. All that said, even though I don't find a whole lot of utility in it, I think this app will exist, maybe not in a perfect form. I think it will exist because someone will just say, hey look, this is super cool stuff, and it will be done with a lot of unintended consequences.
Rob:Rabbit tried that, didn't they? They made that little Rabbit box that was supposed to be able to do things like this, and it was ultimately more of a scam than anything. But the catch was at the backend, and all these questions came up then too, of like, how does it know all these things? And it's very personal. Again, they can't use generically mined data for things like this. It has to be yours, and it has to come from questions it asked, or you fill a massive questionnaire out, or it gets it from somewhere else, which is equally scary. But yeah, it's like I don't think it needs to exist. And you made a great point: I can buy a plane ticket in two minutes.
PJ:so I
Rob:I.
PJ:and again, I'm gonna be talking theoretical land, 'cause I can't think through a good example, but if I had an assistant that was able to do stuff for me in parallel, where it was more cost effective, but it required a whole lot of my personal information, then in theory I could be more productive. And again, we're in theory land at this point in time. That's why I sort of think, in China, if I have no right to my data, or in another country where I have sort of no right to privacy in some sense, could these models be developed to make people there more productive? Are there operations where it's like, oh, go out and do this, and even though you're gonna write me the crappiest code, you'll make me a little app that I want, which then allows me to do something else? This is where I start to question, like, you know, are there going to be advantages in countries with far less data protections than ours, simply from a productivity and cost standpoint? And again, I'm not saying, Rob, that I want to give up my data. I don't. It's really just trying to think through the economics of, is there some scenario where someone can get advantaged by this stuff, over and above, let's say, someone in the US or the EU?
Rob:Yeah, it's a good point. But then, do I really need to save the five minutes it takes me to book a plane ticket? I'm not a CEO. Like, think who has living personal assistants: a secretary or someone who is a true personal assistant. I'm not that busy.
PJ:Right.
Rob:I can just do it myself, and I'm just as productive as I was, 'cause I've got everything I need to get done and I have time to do these things. There are people who are so busy, or think they're so busy, that they need somebody else to do all this for them, and they're willing to give up that level of privacy to another person. And there's consequences if that person does things with that information. What are those consequences in the data world, when a corporation does things with your data
PJ:Right. Uh, here's something where it borders on personal information, but here's an example where I could be vastly more productive. So, when I was managing a whole bunch of people, one of the things I had to do was evaluation reports, and that meant they would provide me data on what their code check-ins were, the design docs they created, to a certain extent descriptions of what they did, and then I had to regurgitate that and, you know, put it into a different form for other managers to see, and justify the ratings and all that. Quite frankly, some of that is corporate data, bordering on personal data. I would love to have an AI that would just pull up all the check-ins that someone made, connect that to what the impact on the business would have been, and go through the design docs to say, okay, it should be this, this, and this, and here's the justification for it. Maybe some human intervention afterwards just to contextualize stuff. But quite honestly, that is something that would make my life a lot more efficient, as opposed to me spending two, three, four hours per person writing up these reports.
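[Editor's note: a minimal sketch of the rollup PJ describes, under some assumptions. It pulls one person's commits from a local git repo with plain `git log` and builds a draft prompt that a manager would hand to whatever model they trust and then edit by hand. The repo path, author email, and prompt wording are illustrative, not any real tool's workflow.]

```python
# Hedged sketch of the report-drafting flow PJ describes: gather one author's
# commits and churn stats, then build a draft-summary prompt for later review.
import subprocess

def commits_for(author: str, repo: str = ".", since: str = "6 months ago") -> str:
    """Collect commit subjects and stats for one author via plain `git log`."""
    result = subprocess.run(
        ["git", "-C", repo, "log", f"--author={author}", f"--since={since}",
         "--pretty=format:%h %s", "--shortstat"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

def review_prompt(author: str) -> str:
    """Build the text a manager would feed to a model, then edit the output by hand."""
    return (
        "Summarize these check-ins into 3-5 bullets of business impact, and flag "
        "anything uncertain for human review:\n\n" + commits_for(author)
    )

if __name__ == "__main__":
    print(review_prompt("pj@example.com"))  # hypothetical author email
```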
Rob:But it makes your life easier, while it being wrong has great consequences for somebody else, who didn't do anything, was just misinterpreted by an AI, and you didn't catch it in your quick scan of it because you trusted it. So what's the consequence of it being wrong, versus the consequence of you being wrong? If you continuously get performance reviews wrong, you'll get reprimanded by your manager or you'll get fired, or whatever it is; there's a consequence for you not doing those performance reports properly. What is the consequence when the AI gets it wrong and you skipped it, because you only gave it five minutes per person instead of three hours a person? I get why you want it, but the consequences for somebody else are far worse with the AI model.
PJ:In either case, whether I did it myself or used a tool, if the thing I represent as this work is wrong, I should get reprimanded. Like, no question. And if it saved me going from three hours to half an hour or an hour, that's a huge productivity win for me.
Rob:Okay, I will buy that. If you'll take responsibility for it, then—
PJ:That's the only way you can look at it.
Rob:But a corporation wouldn't take responsibility for it. That's my point. So in this case, okay, this falls back to what I've always said: AI in little use cases is quite useful. These big large language models that kind of do everything badly aren't that useful. There are much better use cases for AI in solving specific, localized problems, and that kind of fits. It's a big one, and it starts to drift toward the LLM approach because it has to be able to read check-ins and things like that, but it's a closed-world use case where, yes, it's a big backend with access to a lot of data, but it's scoped. There is a scope it fits in. It kind of fits into what I've always said: AI has some very good use cases in very specific realms. But me personally, I mean, you like to use the Copilot stuff; I've tinkered around with it and I don't like it, because right now I'm working on mission-critical code that has to pass DO-178, that has to pass FAA requirements, and it takes me just as long to verify that generated code would pass as it does to just write the code in the first place. You can't say to an AI, write me a mission-critical, DO-178, FAA-compliant navigation system.
PJ:And I think that's a hundred percent fair. There is, to your point, a small set of use cases, or scoped use cases, where I want it to do very specific things, and it's not a human-language compiler where it can take exactly that specification and spit out exactly the code at the other end. It'd be interesting to see someone train a model to do exactly that.
Rob:Again, training data. In this specific case, there's not enough example code out there.
PJ:And I think there's an interesting case here, where you sent me the C++ rules for, I can't remember, it was for the Joint Strike Fighter, right?
Rob:It was Lockheed Martin's document, yeah. It's a public document, and it's a very nice set of rules, but you could give a programmer that document and say, now write code, and they'd write code differently than they would have without that document. And they can be like, I'm aware that I'm obeying all of these rules. Today, you can't do that with an AI. You can't say, take that code and write it in the style of this document.
PJ:It's an interesting question, where, you know, that document could be considered the rules of an expert system.
Rob:They literally are, but there's a lot of contention, yeah, the rules conflict. What do you do when this one is an exception to that one, and which one takes precedence, and why? And a lot of it is documenting the reason, like, I made myself an exception to this rule because I have to call this external library and my rules conflict with their rules, whatever it may be. This crops up all the time, and AI coding is terrible at it. So imagine how bad it would be at booking flights, or figuring out when your kid comes home from college and how to get them home safely. It's just not gonna happen, and Siri's not gonna get it. Alexa and OK Google seem to have gone away, so Siri's the only one left, and she's not getting to that level of smart, ever. And if she doesn't get it, she stays useless, she stays irrelevant. I'll just ask her to play music; that's all I do now and all I will do. If they make a new Siri without it, what's the point of Apple Intelligence? It's all these little localized problems which AI is good at. Is that what they're gonna do? Is that what Apple Intelligence is gonna become? Remove this object from a photo: good use for an AI. Remove the vocals from a song so I can sing over it: silly use, but technically a good use. Is that what Apple Intelligence is gonna become? Just a bunch of little bespoke problems that they solve with localized AI models that they can train on their shitty Apple hardware, and you can run them on the ANE on your Mac, on your Apple TV, on your phone, whatever it may be. I think that's the path they're going down. It's not a coherent Apple intelligence; it's a little tiny trinket of a tool that they can put under the Apple Intelligence badge. I think that's what it's gonna be.
PJ:I get the sense that all these tools are going down that path, and I don't think that's bad, in the sense that, you know, in Linux I can go and pipe sed and grep and all these things, I can just put them together, feed one bit into the other. I don't want the monolithic program. What I want is to know when to use these tools in particular situations and be able to go and do that thing, and it does it great.
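For what it's worth, here's a toy sketch of the composability PJ means, small single-purpose pieces chained Unix-pipe style instead of one monolith; the function names and the sample log lines are made up purely for illustration.

```python
# Toy pipeline: each function does one small thing, and you chain them,
# the same way you'd pipe grep | sed | head on a Linux command line.
import re

def grep(lines, pattern):
    # Keep only lines matching the pattern.
    return (line for line in lines if re.search(pattern, line))

def sed(lines, pattern, replacement):
    # Rewrite each line, like `sed s/pattern/replacement/`.
    return (re.sub(pattern, replacement, line) for line in lines)

def head(lines, n=10):
    # Take the first n lines.
    for i, line in enumerate(lines):
        if i >= n:
            break
        yield line

log = ["INFO boot ok", "ERROR disk full", "ERROR net down", "INFO done"]
# Compose the small tools instead of asking one monolith to do everything.
for line in head(sed(grep(log, r"^ERROR"), r"^ERROR ", ""), n=5):
    print(line)  # prints: "disk full" then "net down"
```

That's the contrast Rob draws next: Apple's features aren't exposed as pieces you can chain like this.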
Rob:It does, but that's not what Apple's doing.
PJ:I get that.
Rob:You can't get the edit list that it made to your photo to remove the lamppost in the background; it's just something that it does. Can you script it from the command line? Probably not. These are very disjoint, bespoke tools, all behind a user interface. You can't combine them yourself into bigger tools. That's just not an Apple thing.
PJ:You know, something's gotta give on some constraint somewhere.
Rob:No, Apple Intelligence is gonna be a bunch of trinkets, some of which are useful to some people, but generally they aren't useful to anybody. And that's what they'll do: they'll keep adding new trinkets under the Apple Intelligence banner, and that's what it will be. I mean, technically, what is Apple Intelligence? Why do you need special hardware?
PJ:So, I mean, I'll be very honest—
Rob:We've talked about this before. It's like my old phone, the iPhone 14 I have in my hand right here: it could do the job. It'd take a bit longer, it might not be a great experience because it's gonna sit there and churn, but is that why they limited Apple Intelligence to the 16? And this comes back to the walled garden; they can do that because we have no control over our own hardware. But I'm pretty sure there's nothing in any of these apps they've released under the Apple Intelligence banner that wouldn't work on my older hardware. It's just that they don't want it to work.
PJ:So at some point in time, I think Apple will make a change if it becomes an existential threat, and I think it's a question of whether they believe there's actually an existential threat here. If there's not, then I agree: Apple Intelligence just becomes a set of trinkets that operate independently. If there is an existential threat, then you have to say, am I gonna release the privacy constraint? Am I gonna release the scriptability constraint? Am I gonna release the API constraint? Am I gonna release the Nvidia constraint? Which thing am I willing to let go of in order to create something competitive in the marketplace? Because in my mind, if it's vaporware and it's not an existential threat, it doesn't matter.
Rob:It should stay vaporware, is what it should do. And secondly, these existential threats, are they real? Microsoft missed the mobile space, and for a while it was like, oh, it's gonna be the end of Microsoft. They came out just fine; they pivoted, did other things. Yeah, they tinkered with it. And I think Windows Phone is gonna be the equivalent of Apple Intelligence: a half-assed attempt at this new tech that we're not a part of but desperately want to be a part of. And it's gonna end the same way. Apple will just go on and do services, and everyone will forget they ever made it.
PJ:And I think that's an incredibly valid way to go. Again, if they'd said a year ago, we're just not gonna touch AI right now, we would've been in the same place, except we wouldn't be ragging on Apple. Right?
Rob:But we would be, 'cause we'd be talking about how badly the stock price dropped and the lawsuits that the investors filed against the CEO, blah, blah, blah. A lot of this is driven by, we have to be involved because we're a public company.
PJ:You know, this is the one spot where I really go hard in the other direction, where I think Tim Cook could have said, look, we don't think there's a great use case right now for AI that we can ship, so we're gonna wait until there's a clear market leader or a clear paradigm shift for us to go after. Don't worry, we'll put our best people on figuring this out. I think that would've been enough. But again,
Rob:I agree.
PJ:it's all counterfactual at this point in time. So I'm wondering what the real stock price is, because what we're saying is that the stock price is being buoyed by vaporware. So now it's actually possibly worse.
Rob:Oh yeah, I'm not disagreeing with you, man. I'm just saying that's what would've happened. Steve Jobs always had this attitude where he didn't care what the stockholders thought, like, I'll do this; but now they do. I think it's just that the change in leadership over the years has made Apple more... not Apple. Apple started off as the counterculture company, and everything was different, the whole Think Different thing. That was true when Steve Jobs was there. But like I've said before, today Apple runs on the ethos of this dead guy who's been dead for ten years, and it's still, well, this is what Steve would do. But is it? Without him being alive to ask, you don't know. So it just becomes this almost folklore thing, and it just kind of gets dissolved, and it's just more corporate speak at this point.
PJ:But that's the thing, what would he
Rob:Yeah.
PJ:have done? Quite honestly, I mean, I think Apple's stock is in vastly more danger if people really start to hammer on, where the hell is this Apple Intelligence thing? Like, you made a promise and then failed, and people are going—
Rob:They sold hardware on a promise, and people will remember that, 'cause that phone was a thousand dollars.
PJ:Right.
Rob:And what did they get? They got nothing. They'll get nothing, because these features won't be out till the next version of iOS, at best, and there'll be a new phone by then too.
PJ:I think this is actually dangerous for them, because we're looking at two promises broken in the last two years. First was the notion of the Apple Vision Pro being the new zeitgeist that was gonna create all this revolution, and that didn't happen. Then they, I think, very recklessly pivoted to this Apple Intelligence stuff and said, hey, we're gonna take the thunder from all this AI stuff, and that hasn't happened yet. So what's the next promise they're gonna try and make? Because now the pressure's on. The right thing to do would be to back off, hold the horses, even if it takes a stock hit, and take some more conservative, cautious steps. If they make a bigger promise on top of that, they've gotta deliver on it.
Rob:Well, that's what they're gonna do. How often does Apple backpedal? Very, very rarely. Yes, they've delayed the new Siri and they've delayed Apple Intelligence, the key features that we still don't have. They should have just come out and said it originally: Apple Intelligence is a collection of smart utilities, we'll start with a few here, and we'll add more over time. Basically, be honest is what we're saying. We're not asking for anything more than: don't bullshit us with produced videos. Go back to live presentations, live demos of what it can do today. If you can't sell it today, don't sell it today. And people make mistakes; no one's saying you can't make mistakes. It's just, don't bullshit us.
PJ:I think them coming clean about this, doing a realignment, doing some mea culpas and saying, look, we will do better, goes a long way. Yeah, there are some short-term consequences, but in the long term it's better than trying to keep making up for it by making bigger and bigger promises. Because when you do fail, and fail big, that's when companies really get hammered.
Rob:Yeah. And like I said, I think this is Apple's version of the moment Microsoft had with mobile. They'll be fine in the long run, they'll be fine, but they won't be a company you think about when you think about AI, just like Microsoft isn't a company you think about when you think about mobile.
PJ:Which is fine. It is a totally fine thing not to think of Microsoft and mobile.
Rob:And I do think that for Apple to be a big player in this, everything needs to change. The walled garden needs to change. Let other people do this, because they're better at it than you are. It's not that one of these things is the cause of the problem; it's a little bit of everything, and it's death by a thousand cuts.
PJ:I agree. I think it's the overlap of all these constraints, and I don't know if you need to remove every constraint, but you need to remove at least one and make a decision about which one. Again, do you remove the privacy bit? Do you remove the walled-garden approach? Do you tone down the promises you're making? Do you go to Nvidia hardware? Make a decision someplace, because otherwise you're backing yourself into a corner and it's hard to get out. At the end of the day, it's okay, it's natural and good for companies to say, look, we made a mistake here, we're gonna do something better, we're gonna match what the marketplace wants. That's really understandable. Trying to do it all, that's a great recipe for disaster.
Rob:For sure, and we'll see. WWDC is right around the corner, and maybe I'll have to eat every word I said, 'cause they'll come out with this perfect solution that we've not thought of. But I'll believe that when I see it.