Tech Made Fun | Tech Podcast By SK NEXUS

TMF 019 - Apple Intelligence - AI for the 'Best' of Us - All You Need to Know About AI in Pakistan

Saqib Tahir Episode 19

Apple's new "Apple Intelligence" suite introduces a range of AI features, including AI writing tools, an improved Siri, enhanced media search, and image generation capabilities. In this episode we go over what AI means for you as the Mango Person (Aam Janta) and why the way Apple did it is noteworthy.
Hosted by Saqib Tahir
Support the show: https://support.sknexus.com/

Read full show notes here: https://sknexus.com/ep19/
Read companion article here: https://sknexus.com/apple-intelligience-for-the-average-person/

Further learning and references
Apple Intelligence
Apple Intelligence: every new AI feature coming to the iPhone and Mac
Apple Intelligence is the company’s new generative AI offering
Everything to know about Apple's AI features coming to iPhones, Macs, and iPads
Introducing Apple Intelligence, the personal intelligence system
Apple is putting ChatGPT in Siri for free later this year
Apple's AI for iPhone Just Showed Google How It's Done

Private Cloud Compute, Privacy & Security concerns
Apple is promising personalized AI in a private cloud. Here’s how that will work.
Private Cloud Compute: A new frontier for AI privacy in the cloud
Apple is building a high-security OS to run its AI data centers - here's what we know so far
A cryptographer’s look at Apple Intelligence
WWDC: Apple’s Private Cloud Compute is what all cloud services should be

Support the show

As always -
Thank you for listening, please send any questions or feedback to podcast@sknexus.com
See you next time.

Subscribe to the newsletter: https://sknexus.com/subscribe/
Keep the show running: https://buymeacoffee.com/sknexuspk

(Transcribed by TurboScribe.ai.) Welcome to another episode of Tech Made Fun. The AI wave has been going for almost two years now, and AI has started getting attached to everything. I've talked about it in plenty of episodes before, because back when we covered tech news, AI came up again and again. Now, though, Apple has finally, after a year or two, implemented AI into the next version of its operating systems. Because of that, I thought I'd make an episode with two parts. The first part is an introduction to AI and a general explanation, because, again, the goal of this podcast is to raise general awareness about tech. So I'll go over everything you need to know about AI. The second part will be about Apple's implementation, why it's so unique and different, and why it's very important to talk about. Okay? Let's get into it. So, first of all, the hype we have to discuss is about generative AI. What does that mean? About one and a half to two years ago, a company called OpenAI released ChatGPT. It was a chat product where you could go and talk to an AI chatbot: ask it questions, chat with it, or gather general information. And it took off. I think it became the most downloaded app within a week, in its entire history, right? There was a lot of hype. All the LinkedIn bros started saying this is the next big thing, all the jobs are gone, everything is gone, destruction and all that. Now one and a half to two years have passed, and we're at a stage where the hype has died down, as expected. There was actually an article recently saying usage has fallen by 80%. People tried ChatGPT when it came out, and I'm sure plenty still use it, but the majority of people don't really care about it. It was just too obstructive, right?
You had to go open a separate app and then chat inside it. It didn't integrate with the things people used in their existing workflows, their existing apps, right? To fix that, Microsoft went on a mission: bro, now we will put AI in everything. They made their own thing, Copilot (actually an old branding that they relaunched), and they put Copilot everywhere: in the Microsoft Office apps, in email, in Outlook, and so on. And that raised a big question, because it's been one and a half to two years now, and I keep bringing that up because that's a long time: this AI, what is it? Is it a product, or is it a feature? Here I'll talk about product management in the traditional fashion. There's a concept in product management called customer value: whenever you launch a product, you have to provide customer value. You sell outcomes; the feature itself doesn't matter. People have a problem and they just need a solution; they don't care how they get that solution. For example, most people who take their car to the mechanic don't care what the mechanic does inside the engine. They just want to hear: your car is fixed now, its problems are solved. The mechanic gives a general overview, I did this and that, and we each take away whatever we understand. Most laypeople don't care how things work. They don't care how many gigahertz or gigabytes there are. They care that, okay, if I buy an iPhone, its camera is very good. Yes, the one Apple is selling now has the best camera of any iPhone. If I buy Samsung's latest phone, the Ultra is the best. Whether it's actually right for that person is a different matter. But generally speaking, this holds when it comes to products.
The layman, or the majority of consumers, doesn't really care how it works. They just care if it works. And that's the biggest problem with AI: so far, AI has not been successful as a product. As a standalone product, disconnected from everything else, that people use day to day? Not really working. So what's happening now is that the people running all these models have understood that it needs to be feature-based, meaning the way forward is integrating AI into the existing things people already use. Right? You must have seen this example: Meta recently put Meta AI inside WhatsApp. You go there, tap Meta AI, and start talking to it. You don't need to download a new app, you don't need to create an account. It's way easier, and we've seen users skyrocket. Similarly, Microsoft, like I said, put AI in everything. And now, one and a half to two years later, Apple has done it too. So that's the brief history of what has changed in the last couple of years: AI came, became a super hyped fad thing, and a bunch of products came out. The Rabbit came, the Humane AI Pin came, two devices that claimed they would replace your smartphone and all that. They failed miserably; one was basically a scam. Then companies realized they needed to integrate AI as a feature into existing products. That was the brief history. Now let's get into some definitions. When we talk about AI now, the thing is, technically speaking, none of this is AI. Okay? But when one thing becomes a brand, people start calling everything by that name. The most famous example I can give: when Tesla's electric cars came out, they came up with this self-driving thing called Autopilot. Because Tesla popularized it and marketed it as such, people started calling it Autopilot.
Even though it was not self-driving and not really autopilot. And similarly here: because of the popularity that ChatGPT garnered, people started calling all of this AI, when in reality it's not true AI. So to differentiate these things, three levels of AI were defined, which are very commonly used in the tech world, and I want you to be aware of them, so that if you're chatting with someone you have some material: this is not AI, this is this or that. Right? So AI is currently, broadly speaking, defined in three categories: first ANI, then AGI, and then ASI. ANI is Artificial Narrow Intelligence, and this is what we have currently. All your generative models like ChatGPT, your Siri, Google Assistant, your driving assists, all the current technology: we call it ANI, Artificial Narrow Intelligence. Some people jokingly call it Artificial Not Intelligence. The next level is Artificial General Intelligence, which is hypothetical for now; I don't think anyone has achieved true AGI, and it will take a long time. The main difference between ANI and AGI, which you need to understand, is that ANI is all about data and training on that data. If I show an ANI model two kinds of images again and again, a house and a car, and train it repeatedly on this is what a house photo looks like, this is what a car photo looks like, then over time, with enough training data, after seeing enough house photos and car photos, the ANI gets more and more accurate at telling them apart. That's a very simplified version of how training works. If you want more detail, there's a very good CGP Grey video; go and search for his video on artificial intelligence. He made a very good animation, and you'll understand very well what happens at the basic level of AI: how training data is used, and how the knobs get tweaked to, you know,
increase the accuracy. Okay. So ANI is training-based. AGI, Artificial General Intelligence, is again a theoretical concept: the idea is that it will have cognitive ability. Cognitive ability means real capacity to understand, because right now, whatever AI is, it has no context and no thought. It just keeps talking, or telling, or showing, based on whatever random data it was trained on. AGI would be the next evolution, where it can actually understand what you're saying to it and has some cognitive ability to reason about its answer. Okay? This is all theoretical; it hasn't arrived yet. And then, because everything has to be taken to the extreme, the next level is Artificial Super Intelligence, which is even more theoretical, even further in the future: your AGI becomes so improved and so capable that it surpasses human intelligence. That's the key difference. AGI will have cognitive ability but sit roughly at human level, maybe better at some things, while ASI will definitely be above human capability, because its processing power and its access to data will exceed an ordinary human brain. Again, this is all hypothetical, but the differentiation between ANI, AGI, and ASI is very important to understand. In a future episode I want to cover progressive versus disruptive tech, and this is key to that: the ANI we have now is kind of progressive tech, not disruptive. When it comes to AGI and ASI, then maybe we head toward disruption. Anyway, that definition was important to cover. Now let's move to two other things you need to know: diffusion models and transformer models. Currently, like I said, it's all ANI, and the two most common examples in ANI
are the diffusion and transformer models. In very simple words, a transformer model is a kind of model that's really good at predicting what the next word should be. That's it. It's like a very fancy search engine. When you chat with ChatGPT or Meta AI, how does it answer? It has read a huge amount of data, it runs a statistical analysis, and then it presents the result to you very fast, so it looks like it's responding in real time. But behind the scenes, the transformer model has absorbed a lot of data and keeps asking: given this sentence so far, what should the next word be? And the next word after that? That's how it completes text. So it's a very fancy search engine, or you could say autocomplete. The diffusion model, on the other side, is for image and video generation. Midjourney, if you've heard of it, or Google's Imagen, I think: you give it a prompt, create an image of a car, and it creates an image of a car for you. How does it work? It also has a lot of training data, thousands of pictures, and a little bit of understanding that a car looks like such-and-such. So it diffuses across thousands and thousands of images and, if you watch it in action (a very interesting thing to see), the noise slowly morphs into a car, until the model decides: okay, this is good enough. There's only a little contextual understanding there, very little. It's not that it knows what a car is; it was trained on keywords, that a car is this kind of thing, and it tries to match the prompt against its training data. Okay? That's why you'll often see that diffusion models, the image and video generation ones, struggle with empirical stuff like numbers or text. Like if I tell a diffusion model: make me a picture with a board that has "Saqib" written on it.
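As a quick aside on the transformer side described above, the "fancy autocomplete" idea can be sketched in a few lines. To be clear, this is nothing like a real transformer (no neural network, no context beyond one word); the corpus and function names are made up for illustration. It just counts which word tends to follow which, which is the statistical heart of next-word prediction:

```python
from collections import Counter, defaultdict

# A tiny made-up corpus, standing in for the huge datasets
# real models are trained on.
corpus = "the car is fast the car is red the car is old the house is big".split()

# Count which word follows which (a bigram table).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word."""
    return following[word].most_common(1)[0][0]

# "the" was followed by "car" three times and "house" once,
# so the model's best guess for the next word is "car".
print(predict_next("the"))
```

A real model does this over billions of documents and whole conversations instead of single words, but the job is the same: given what came before, emit the statistically plausible next token. Nothing in the table "understands" cars or houses, which is exactly the limitation the episode keeps pointing at.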
There's a 99% chance it won't manage to write "Saqib", because that probably never appeared in its training data. If it's something that is in its training data, there's a slight chance it might get it right. Understanding this is very important, because when these companies launch something, like when Sora launched (we covered that in an earlier episode), they show things in the demo that look outclassed, and people get scared: oh brother, that's it, jobs are gone, everything is gone. But that's just a demo. Those are handpicked examples, the 0.001%. In everyday use, in reality, these models have a lot of issues. Even now, if you're a little tech-savvy and have an eye for detail, it's very easy to tell that an image is AI-generated: the fingers will be off, the teeth will be off, and so on. But it does make it very easy for a scammer to get started: generate some images here, then go into Photoshop and do some manual editing. It makes starting out very easy, and that's the whole point about this AI so far. The AI we have, the transformer and diffusion models, which are ANI, are really good at generating text, and really good at generating images that aren't very technical or empirical. And because of that, its largest impact landed on those two things: content based on text, and content based on still imagery. There has been a lot of debate and a lot of fighting about that, and for the past year or two, court cases have been going on, and who knows where they will end up. Okay, so that was diffusion models and transformer models. Now just one last thing, I promise, the last thing, which is very important for you to understand: AI hallucinations. Hallucination is a term you'll often hear, that the AI is hallucinating. It means that its output is wrong,
or that it's simply making things up on its own. These transformer and diffusion models are optimized to generate plausible-sounding text or plausible-looking images. But verifying text is harder than verifying images. With images, like I said, it's obvious: the fingers are wrong, the teeth are wrong, you look at the person and say, this image is not real. With text, it's really hard. You ask it something and it very confidently gives you the wrong answer, and that's where the problem occurs, because people are lazy and won't go verify it. What happens instead: copy-paste, whatever the prompt returned, pass the answer along. There was actually a funny story about this. When ChatGPT was new, lawyers were very happy: oh yes, this will help a lot with case research. So there was a lawyer asking ChatGPT for reference cases, and ChatGPT invented all the cases on its own and handed them over. And he, without checking or confirming anything, submitted them to the court, and it only came out later that they were generated. There was a lot of backlash; I think his license was revoked, but don't quote me on that. Basically, what I mean to say is that AI hallucination is a very real problem. These diffusion and transformer models are built to be creative: they're great storytellers, or great for visual aids. So, creatively speaking, they need flexibility to do those things. But the trade-off on the other side is accuracy. Because they're being creative, they aren't tightly controlled or restricted, so they run into this issue where, especially with more complex kinds of prompts, they hallucinate way too much. And there, the person's whole experience breaks down. I know so many people
who jumped on the AI hype train, used AI, and then after two or three months said: bro, with the amount of time I spend working around AI, I think I'd have been faster doing it myself. Especially people in industries where facts, information, and accuracy matter a lot. And that's just how it is. Which brings us to the second part: Apple's integration of AI. So for the past year or year and a half it was all dot-AI this, dot-AI that, this feature has AI, this company's AI has arrived, and everyone was waiting: when Apple does it, it will do it well. There was a lot of pressure on Apple that when it does its AI integration, it should be one that's actually useful. And this goes back to our earlier point about product versus feature. Apple's unique advantage is basically that it's a vertically integrated company, meaning they make their software, they make their hardware, they make their own apps, they do everything themselves, A to Z. They own the entire vertical stack. And the benefit is that when they bring in AI, the expectation was: because everything at Apple is vertically integrated, their AI implementation will be very good. So yes, a few days ago the WWDC event happened and they released their AI, and in typical Apple fashion they had to brand it, so they called it Apple Intelligence, not AI. The reason I was a little tickled was their tagline: "AI for the rest of us." It made me laugh, because in typical Apple fashion, they live in Apple land and only look at America, where, sure, maybe 50% of users have an iPhone. But if you look at the whole world, hardly anyone has an iPhone. So I renamed it "AI for the best of us," because, you know, most people can't afford it. That being said, here are three things Apple did that are very special. First of all, as I said,
because they are vertically integrated, there was a very big expectation that their AI integration would happen at a much deeper level, meaning AI would be added as a feature to what you're already doing on your iPhone, on macOS, and on iPadOS, with nothing new to learn and no new workflow, very easy to understand. And that's what they did. The primary features they shipped are transformer-based: the AI writing tools. Because, you know, transformers are good with writing, they'll help you rewrite, proofread, and summarize stuff. Think of it like this: you're writing an email, you've written something long, and you say, proofread it so there are no mistakes, or summarize it, or rewrite it. All of this happens natively. You don't need to go to another app or switch contexts. Wherever there's a writing window, this feature will eventually be there, the same way copy and paste shows up wherever you tap into text; I think they'll roll it out everywhere eventually, with time. The second thing: diffusion models. With diffusion models, Apple knows it's scary, right? Like I said, it'll give someone an extra finger or two, or mess up the teeth; when generating human images, the eyes, teeth, and fingers especially get mangled. So what did they do? They shipped an image generation feature but limited it to stylistic choices: animation, illustration, and sketches. That way they're kind of safe; nobody produces a realistic image gone wrong. Then again, I'm sure someone will figure out how to hack it, and it'll be in the news that Apple made some disgusting picture or whatever. But yeah, they made the smart move: you can make images, but only in these stylistic choices; we're staying away from realism because we don't want to mess that up. That was their safe move. And as I said,
transformer models are really good at writing, and specifically at rewriting, proofreading, and summarizing. That's what they launched with. They didn't say: oh, using Apple's AI you can write a whole article or keep a journal, the way ChatGPT and such companies were promising in the beginning. Their pitch was more like: look, if you're a writer, this will help you write better. And that is a very good use case for transformer models. So this is all to say that Apple did deliver on expectations. They did a really deeply integrated, well-judged sprinkling of AI, as they say, across their ecosystem. They didn't claim anything over the top, didn't promise anything extra, and they did what most people expected: a useful feature set for those who want to use AI, staying away from the useless, faddish stuff. The second thing I want to highlight from the Apple event: Private Cloud Compute. Basically, running these transformer and diffusion models takes a lot of resources. Thankfully, the iPhone is already pretty strong, pretty performant. But the issue is that there will be times when the request, the thing that needs to be worked on, needs more computing than the phone has. Or, if the phone did try to compute it, the battery would drain, it would overheat, and there could be a hundred other problems, after which the customer says my iPhone has started exploding, and then the news headlines and all that; you can imagine. So Apple put in the extra effort (I'm pretty sure this is why it took them an extra year and a half to release) and built their own ecosystem for offloading AI work. They made Private Cloud Compute, which in simple words means: if you make a request that's too heavy for the phone, that request goes to Apple's own servers, running on Apple silicon, and it gets processed there.
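The on-device-versus-cloud decision described here can be sketched roughly as follows. To be clear, Apple has not published an API like this; the cost limit, the heuristic, and the function names below are all invented for illustration of the routing idea only:

```python
# Purely illustrative sketch of "run it locally if cheap,
# offload it if heavy". Not Apple's actual API or thresholds.

ON_DEVICE_LIMIT = 3_000  # hypothetical "cost" a phone can absorb

def estimate_cost(request: str) -> int:
    # Stand-in heuristic: longer requests count as more expensive.
    return len(request) * 10

def handle(request: str) -> str:
    if estimate_cost(request) <= ON_DEVICE_LIMIT:
        # Cheap enough: process locally, nothing leaves the phone.
        return f"on-device: {request!r}"
    # Too heavy: offload. In Apple's design the offloaded request
    # is encrypted and processed on stateless Apple-silicon servers.
    return f"private-cloud: {request!r}"

print(handle("summarize this email"))
print(handle("x" * 1000))
```

The interesting part of Private Cloud Compute is not the routing itself but everything around the second branch: encryption in transit, servers that keep no state, and a design outsiders can verify. The sketch only shows where the fork in the road sits.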
Then you get the result back. And they did this in a very secure way, because that was the issue a lot of people had: my data should stay on my phone; if it goes to a server, it can get shared, and so on. The way they did it is very good. If you want to learn more, there's an excellent article linked on the website; there will be a link in the description, the second or third one, and you can go read the whole thing. It's a bit technical for the average person, but what you need to know is that no one has done it the way Apple did, it's that simple. And this is not me fanboying over Apple; it's actually very impressive. Even the cybersecurity community will be a little impressed at how much effort they put in to release these AI features. So that was Private Cloud Compute. The last thing, which was very misleading in my opinion, and a lot of videos made fun of it: the ChatGPT integration. So, one more layer, to provide a better service, I guess; time was short and the AI hype train had to be ridden, because Apple usually releases their OS updates around these months. So for the time being, they integrated with ChatGPT. But it's a very limited integration, and that's something almost no one is saying openly. In fact, I've seen some videos that were frustrating because they openly misguide people into thinking Apple's AI all runs on ChatGPT. Even Elon Musk started a big drama over it. No, that's not how it is. What it actually is: ChatGPT is just an extension of Siri, I believe. If you ask a question and Siri doesn't have the answer, or it would take too long to process, it asks for your permission: brother, if you need this information, can I go ask my friend ChatGPT and bring it back to you? And you stay in control: okay, yes,
go get the answer from ChatGPT and come back. Or, for advanced functionality, you can opt in to using ChatGPT. It's all permission-based; it's not forced on you. Okay. And the other thing, again, because it's Apple, privacy matters: it sends very limited metadata to ChatGPT, and they've committed that everything will be stateless, or serverless, whatever you want to call it, and nothing will be saved on ChatGPT's or OpenAI's servers. That part is still a bit fishy; I wouldn't say that claim is 100% verifiable. Unlike Private Cloud Compute, where Apple does everything itself, their own servers, their own ecosystem, which is more believable and also verifiable by external people, here you have to take Apple's word for it. I think it'll be fine, because the reputation risk isn't worth whatever value OpenAI would get from saving iPhone users' queries. So those were the three things I think were important to highlight about what Apple has done, including the OpenAI piece. What I'm excited for: I don't use an iPhone, but I do use macOS sometimes, and the integration on macOS is what really interests me. To put it simply, I've been a Windows user for ten or twelve years, probably more. But the problem is that with this Copilot thing and Microsoft's mess-ups and screw-ups on Windows (there could be a whole separate episode on that), I, like many people, am very frustrated with Windows' direction. I haven't even updated to Windows 11; it's that simple. But when I look at Apple's side and the way they've integrated AI in a safe, private manner, without forcing you, it's up to you if you want to use it, unlike Windows where everything is forced and on by default, it makes me wonder. I think a lot of people
will shift to Mac just because of this. Because, for the last four or five years, across all hardware, smartphones, laptops, computers, there hasn't been any significant difference in performance or gains. I have a five-year-old phone and it works fine, no issues; I have a four-year-old laptop and it also works fine. But this AI thing, if more companies figure out how to do it as a feature and integrate it well, will be the next competition factor over the next three or four years: which company's device has the better AI? Is it better on Xiaomi's phone, Samsung's phone, or Google's phone? We've already seen this. For the last couple of years, Samsung's Galaxy phones weren't selling that well. With the S24 Ultra they partnered with Google and launched AI features. All of that AI was already on the Pixel, but because Samsung has a far bigger customer base than the Pixel, it got a huge boost. The S24 Ultra outsold, I think, all the recent S-series phones, just because it had AI photo editing and other AI features. And that's exactly my point: understanding AI, what it actually is, and how it's useful is very, very important, because in the next three or four years all our purchase decisions will revolve around which device has good AI. Will it be useful for me or not? Am I wasting my money, buying something I won't even use? And lastly, and most importantly, how much is the thing I'm buying spying on me? Anyway, here are some questions I want you to reflect on if you're listening to this episode. Generative AI: had you ever heard of it before? It's unlikely you hadn't, but I'm sure there are people who just saw this Apple event and only then remembered what it was, or started caring about it again. So was this your first exposure, or had you already heard about AI? And the second thing,
what I'd be really interested to know is: do you use AI in your current workflows or not? I do, and I'll talk about how in today's after-show. I look forward to hearing your responses. As always, thank you for listening; see you in the next episode. So, yeah, the after-show, where today I want to talk about how I use AI. I hate hype trains and fads and all these things. I've been in the tech ecosphere a long time, and you get jaded over time, tired of hearing "this new revolutionary tech will change your life," so I take my time before I jump on any train. But now, after a year and a half or two, as I said, I am actually using AI in some of my workflows, in places where it's a good use case. As I explained earlier, I think the transformer model is in better shape than the diffusion model, so AI is very good for text-related things, and that's exactly what I use it for. The most recent example: my episodes are in Urdu, as you know. The problem with Urdu is that it's not a well-indexed language, meaning that if a person searches Google for episodes or subtitles, Urdu results won't show up for them. So I use this app called turboscribe.ai to transcribe each episode with English subtitles. They're not 100% accurate, but since I speak half in English anyway, it produces a good, easy-to-process transcription, and I attach it to the episodes. The other thing I do with AI is general research. If I want to research a topic or explore a new learning area, I use AI tools. Now, the problem, as I said, is that AI is very confidently wrong. Ask it a question or do research with it and it will often give you wrong or outdated information. So recently, three or four months ago, I switched to using Perplexity.
The reason I switched to Perplexity is that, and this is a good thing, all your queries are referenced with source articles for free. In ChatGPT, I think, you have to pay for online access and referenced source material, but on Perplexity it's free. So now I use Perplexity for my research, because whenever I write or research something, it gives me a response with links to the articles, which I can click to go and verify the responses myself. Usually, given the nature of my work and the kinds of things I use AI for, I already have a good understanding of what I need to research or talk about; I'm just double-checking or triple-checking when I go to the AI, and it's really good for that. But I'd still suggest that if you're trying AI for the first time, always take everything it says with a grain of salt, because it will be wrong. And the last thing, the one I'll admit I spent money on because of AI: my phone. I have a Google Pixel, and two or three years ago, I think, it got this remove-blur feature. Now, I have two kids, and if you have kids, you know that taking photos of them is the most difficult thing in the world. So there are actually two features here. The first: if you take a photo and it's a little blurry, it uses machine learning to unblur it, effectively raising the resolution and eliminating the blur. It works really nicely, so the kids get clear photos. I saw that feature and said, this is a must-have. The second feature, which has eventually come to every Pixel, I think the 6, 7, and 8, is Best Take. Here's the thing: if you're a "pro photographer" like me, you don't take one photo, you take twenty photos of everything, because you never know in which moment someone's face is where, or where their eyes are pointing, and it's difficult to set it all up again and again.
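Picking a good frame out of a burst is something you can sketch without any of Google's actual pipeline. One classic heuristic is to score each frame by how strong its edges are (a blurry frame has smoother transitions between neighbouring pixels) and keep the highest-scoring one. The tiny "images" below are made-up lists of pixel values, purely for illustration:

```python
# Not Google's actual Best Take / unblur pipeline, just the core
# idea: score every frame in a burst for sharpness, keep the best.

def sharpness(image):
    """Sum of squared differences between horizontal neighbours.

    Crisp edges mean big jumps between adjacent pixels, so a
    sharper frame scores higher than a blurry one.
    """
    return sum(
        (row[i] - row[i + 1]) ** 2
        for row in image
        for i in range(len(row) - 1)
    )

def best_take(burst):
    """Pick the frame with the strongest edges."""
    return max(burst, key=sharpness)

sharp_frame = [[0, 255, 0, 255]]        # hard black/white edges
blurry_frame = [[100, 140, 120, 130]]   # smooth, low-contrast values

print(best_take([blurry_frame, sharp_frame]) is sharp_frame)  # True
```

The real Best Take goes much further, detecting individual faces across frames and stitching the chosen ones into a single shot, but a per-frame quality score in this spirit is the sort of first step such a feature might build on.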
So pro photographers like me are constantly tapping the photo button, hoping that at least one shot in between will be good, and I do the same. Google has taken advantage of that habit with Best Take. What Best Take does is analyze all the photos and let you change everyone's face: if someone's mouth is closed or open, or their eyes are closed, you can pick which face you want, and it stitches them together into one photo that looks really nice. Yeah, it's good.

But was it money well spent? I don't think so. I bought it at the time thinking: man, photos are something I love taking, and this will help me a lot. But the thing is, even if I hadn't bought a new phone, this feature would eventually have come to my other phone on its own. So I don't think it was money well spent, but it was really good for understanding the direction this stuff is heading in, and how this AI can actually be helpful. And I still wanted to try new things, and I wanted to buy a new phone anyway, so maybe I paid a 20-30 thousand AI tax. But you know, it felt really good when I bought it.

What I'm really excited to see is how this spreads beyond the Google Pixel. These AI features have been a revolution on Samsung, and now they're coming to the iPhone too with the next update. So what about the rest of the phones in the market: Xiaomi, Redmi, Oppo, Shoppo, our local-market phones? How will AI features be introduced in them? For the past 10 years they have been printing "AI camera" on the backs of their phones, but all that was doing was applying a filter after detecting the scene, and it did a really bad job of it. Now I want to see whether they ship any actual AI features in their phones, because if common people get access, that's good, better for most people.

And talking about most people: you know, 95% of the people who listen to this episode don't really subscribe anywhere. I understand the reason. Look, it's a podcast, it's audio, you're probably not listening to it on YouTube. That's why there is a subscribe link at the top of the episode description. How that helps is that every time a new episode goes live, you get an email, and it helps me know that there are people who are listening regularly. So yeah, if you can, please do click the subscription link at the top of the description. Or, if you're listening to this on YouTube, then like, subscribe, press the bell button, and everything. Okay.
Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.

Accidental Tech Podcast (Marco Arment, Casey Liss, John Siracusa)
The WAN Show (Linus Tech Tips)
Waveform: The MKBHD Podcast (Vox Media Podcast Network)
Robot or Not? (John Siracusa and Jason Snell)
Darknet Diaries (Jack Rhysider)