I think this year's Apple WWDC keynote was very interesting:
Powerful
Intuitive
Integrated
Personal
Private
…it has to be built with privacy.

Apple really is good at this. One of the most valuable things about Apple is its insight into new technology: before putting a technology into its own products, it watches the market and studies the strengths and weaknesses of comparable products. These five points are essentially what Apple observed, and it is building AI into its products around these five principles. With that in mind, the first thing you see is:
The cornerstone of the personal intelligence system is on-device processing
Apple says most AI requests will be computed on-device. When the device can't handle a request, there are two fallbacks: Apple's own servers, or ChatGPT's servers. But as stated up front, on-device processing is a cornerstone of the architecture, which makes me think Apple was somewhat forced to release AI now. I don't mean they wouldn't have shipped it otherwise, only that they might have shipped it later, because what they clearly want is to avoid servers entirely: not ChatGPT's, not Apple's, everything computed on the iPhone itself. The problem is that today's chips aren't powerful enough for that; even when the device can produce a result, it has to strain to do it. So for now some requests have to go to Apple's servers, or to ChatGPT. I believe that within three or four years, all of this computation may move onto the phone.
So how did Apple dress up the part about sending data to its own servers? It claims the data will be 100% private, and that third-party companies can inspect and audit the system at any time. Honestly, these are very bold claims, claims Microsoft and Google don't dare make, because if they did they would end up in court. If Apple is willing to state publicly that it is 100% private, it will do its best to actually make it private. That is also why this AI announcement matters: all of the servers Apple operates for it run on Apple Silicon, Apple's own chips, which means no third company is involved in providing the service. Apple does everything itself, so Apple itself becomes the guarantee of trust.
Siri
Siri first appeared 13 years ago, and if you compare the Siri in iOS 17 with the Siri of 13 years ago, the difference isn't that big: it gained Siri Shortcuts and can surface some information from its knowledge base, but beyond that Siri has remained a frustratingly dumb voice-control feature. Only now is it finally being upgraded, which is late to the point of being outrageous, given how early Apple was to this space. If they had taken Siri seriously, the large language model explosion might have come from Apple rather than OpenAI.
Type to Siri
With just a double tap at the bottom of the screen, I can quickly and quietly ask…

Did this Type to Siri feature exist before? I don't remember when it first arrived in iOS, or why Apple tucked it away, but now they have brought it back.
App Intents API
And this won’t be limited to apps made by Apple.
For developers, they’ll be able to use the App Intents Framework.
This is the App Intents API. Something similar has existed since at least 2022: open the Shortcuts app, and when you build a shortcut there is a long list of Actions to choose from; that pile of Actions is what App Intents are. You can chain the actions you have just performed into a Shortcut and run them all with one tap. In other words, every button you press in an app is an intent, a function the app can perform.
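As a rough illustration, here is what exposing one such action looks like in Swift with the App Intents framework. The intent itself (a bookmarking action) and its behavior are invented for this sketch; only the protocol, the property wrapper, and the result type are the framework's own.

```swift
import AppIntents

// Hypothetical example: an app exposing "add a bookmark" as an action
// that Shortcuts (and, in principle, Siri) can invoke.
struct AddBookmarkIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Bookmark"

    @Parameter(title: "Address")
    var address: String

    @Parameter(title: "Note")
    var note: String?

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would persist the bookmark in its own storage here.
        return .result(dialog: "Added the bookmark.")
    }
}
```

Anything declared this way shows up as an Action in Shortcuts, and that is exactly the surface a smarter Siri would need to drive.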
The key to turning this into AI is how you connect the large language model to those App Intents: after you finish talking to the AI, does it actually know how to do everything you asked? When it comes to Actions, this is the hardest part of Apple Intelligence and the part most worth demonstrating.
But the keynote we watched was a pre-recorded video, not a live demo, so until it officially ships we have no way to judge how tightly and reliably the language model is wired to those Actions.
What's the weather like for tomorrow at Muir Beach? Oh wait, I meant Muir Woods
The forecast is calling for clear skies in the morning
At the same time, you'll notice that each request covers only one Action. Is there any way to bundle several Actions into a single prompt and have it carry out four or five of them in a row? That's something I'm looking forward to, and I don't believe Siri will have that capability within a year.
Then they demoed retrieving flight information from a message, checking that flight's status in real time, sending a reply, and digging lunch plans out of a message thread. Here something starts to affect the user experience: delay and speed. To be honest, the requests shown could almost all be handled on one device, which is to say the keynote never demoed the part that gets processed on Apple's servers. I'd guess that sending a request to Apple's servers and waiting for the result to come back will take noticeably longer.
So how involved is a conversation with this AI? First the request has to go to Apple's Private Cloud Compute, and they said nothing about how fast Private Cloud Compute is or how quickly results come back. Can it deliver a seamless experience? Honestly, I'm skeptical. Apple says it's useful and showed it being useful, but I can only imagine the load times; even if Steve Jobs were still alive, a load of more than three seconds would be completely unacceptable to him, or to anyone. How long does such a prompt take to return a result? The demo never says.
Response time wasn't mentioned at all. ChatGPT has moved to GPT-4o, which is fast and accurate; how fast and how accurate is Apple's own large language model? I remain doubtful. You'll also notice the keynote only ever shows a processing spinner: tap, load for a moment, result appears. Which part of that was on-device processing and which part was sent to Private Cloud Compute? It never says.
Smart Reply
Smart Reply in Mail: you will now see suggestions for your response based on the email.
As for Smart Reply, I find it rather weak. When you type on your iPhone, the keyboard already offers suggestions above the keys, and those suggestions can already help you write an email. Smart Reply just swaps that for a large language model, so the results may be somewhat better.
It's only a little icing on the cake.

Genmoji

One of the things Genmoji lets you do is generate an emoji that looks like a person: like a friend, or like yourself. My reaction is: if that's the use case, what about Memoji? You don't need AI for that. Press a button and you can already generate a Memoji of yourself or of a friend, and that Memoji can already be turned into emoji. Instead, Genmoji generates these pictures while ignoring everything users can already do and all the technology Memoji has built up: the effort people put into assembling a Memoji detail by detail, choosing how big the nose should be. Why not let the AI build a Memoji for me? You'll notice the keynote didn't dare mention Memoji even once.
Take Image Playground: it generates pictures of, say, my grandma, in a particular style, and that style isn't really iOS's own. iOS already has a very recognizable style, and it's the Memoji style. These generated images get stuck somewhere in the middle, cartoon-like but not quite photos, and to me they just look uncomfortable. Every previous year the keynote talked up Memoji: new shirts, new glasses and hats, more accessories to wear. Memoji is easy to use, but with generative AI Apple apparently felt it didn't need to carry any of that forward. One of the five principles was Deeply Integrated, yet the integration with Memoji is unfinished and they launched it anyway. You can use these generated pictures as a playful expression or turn them into a sticker; isn't that exactly what Memoji is for? Those Memoji features have been shoved aside, and this part of the system simply isn't finished.
(Within your note, Image Wand uses on-device intelligence to analyze your sketch and words)
To be honest, this part of the demo isn't up to par. Look at the sketch drawn by hand and then at the generated picture: it isn't really what was drawn, from any angle. The second thing is that the drawing-to-image system is the part I actually find interesting. If you've played with AI image generation, Stable Diffusion has a model called ControlNet that takes the lines you draw and turns them into a realistic picture in different styles. Image Wand doesn't work that way: it only uses AI to describe what you drew and then adds other things, so the generated picture isn't your hand drawing turned into a polished image. That disappoints me; it can't do what ControlNet does, and yet here it is on stage being talked up.
Then there's the claim that you can do a lot with your photos: it finds a bunch of pictures and turns them into a short movie. I think that's a bit of a stretch, because it isn't a new feature at all. What's new is that it's easier to find related pictures, by searching with natural language. Turning a pile of pictures into a short movie already exists: it's Memories in the Photos app, on the For You page. Yet it's presented as if it were brand new. It isn't; it's an addition to Memories that makes them customizable, so you can pick the pictures yourself and call the result a "Memory".
(…are also coming to the Phone app, and when you start recording in a live call, participants are automatically notified, so no one is surprised)
This time the focus is on Call Recording: it can recognize speech, transcribe it, and finally produce a summary. What wasn't made clear is the notification part. Apple says that when you record a call, the other party will know you are recording. But how do they know? Does it only work if they are also on an iPhone? If your friend is on an Android phone, will they still be told the call is being recorded? None of that was addressed, and even after updating to iOS 18 the Call Recording feature still isn't usable. It doesn't feel integrated at all. On Android, call recording plays a beep during the call; what exactly will Apple present to the other side, beep beep? Next they talk about other AI tools.
A tool like ChatGPT has taken in data from all over the world; you can ask it about anything. Apple Intelligence, with its own training data and its own large language model, has no outside information; it only has what's on your phone. As an AI that leaves gaps: anything touching current events, or general knowledge about the world, it can't answer on its own, and that's where it redirects you.
In the ChatGPT results view you'll see the line "Check important info for mistakes." In other words, Apple expects ChatGPT to get things wrong, and when you're integrating someone else's technology that disclaimer works in Apple's favor: the mistakes aren't Apple's. Even before the keynote I was wondering how Apple would handle this problem: fake news, questions with inaccurate answers. Does it have a way to solve it? No, it doesn't. There's no way around it, so Apple simply outsources it to ChatGPT, and that's what this view is telling you. Apple can't yet deliver its ideal AI, so it behaves just like the old Siri: "Have a question? Let me Google that for you." Back then it leaned on Google; now it hands the hard part to ChatGPT and steps aside.
Then there's the link between the large language model and the App Intents Actions: how do you stop the model from misunderstanding what you said? There are a huge number of App Intents in the system, so many different Actions; how do you prevent the AI from making mistakes and picking the wrong one? All of this genuinely interests me, and honestly, after watching the keynote, I doubt they can get it right within the next year or two.
Coming back to ChatGPT, something said here is very interesting, and it's the part I'm most hesitant about: the claim that you can use GPT-4o through this integration.

ChatGPT-4o, for free.

It isn't available where I live; with a VPN, or in a supported country, you can use it at no cost. But normally the number of 4o prompts on the free tier is limited, maybe something on the order of 30 an hour, and Apple never even mentioned whether a limit applies here. On ChatGPT's own free plan, after a few uses you get downgraded to GPT-3.5, so does this integration really mean unlimited usage? These details have a big impact on the user experience. Running GPT-4o takes a lot of money and a lot of compute; every prompt costs someone something, so letting everyone use it freely, as much as they like? That sounds too good to be true. The keynote also mentioned that if you're a paying ChatGPT subscriber:
You’ll be able to connect your account and access paid features right within our experiences
So if you use the free tier inside the system, presumably you'll hit the same caps or get downgraded. Apple also said there will be no records, that nothing is stored by ChatGPT or OpenAI:
Your requests and information will not be logged.
This is presumably a condition Apple demanded: there must be no records. Apple stated it clearly, but at that point the party you have to trust is no longer Apple. Do you believe that ChatGPT doesn't store your information? Doesn't that conflict with the three points made earlier?
Your data is never stored
Used only for your requests
Verifiable privacy promise
That said, every time a request needs ChatGPT, the system asks whether you want to send it out to ChatGPT. In other words, your data only goes over once you have consented, so if something goes wrong later, you take it up with OpenAI, not Apple.

Back to the App Intents API, which has been updated.
There are some predefined Actions inside, presented under the Apple Intelligence banner, and for now the Actions it can perform, the ones Siri can help you with, are limited to a few domains: Books, browsers, tidying up text, touching up images, simple Actions like that. Beyond those, what Siri can do for you depends on whether developers write support into their apps.
That shapes how this AI will grow: its evolution will be relatively slow, because you're relying on the more diligent developers on the App Store to add support. Plenty of developers and apps aren't that diligent; maybe a year from now, when they have some spare time, they'll finally get around to it. So even though the whole keynote talked up how great Apple Intelligence is, in practice it may spend years just catching up to what Google Assistant can already help you do, with these capabilities trickling out gradually. From June to the official launch in September is only three months; how many developers will get this code into their apps in three months? At launch it will come down to this: whatever Apple's own native apps support is what you get.
That will be very restrictive. Say you want to take the photos from an event last month, upload them to Dropbox, and send the Dropbox link to a client. First you'd have to use the iOS Mail app, and the photos would have to live in Apple's Photos app; another photo-album app won't do. Then you'd still have to wait for Dropbox to wrap these functions in App Intents inside their own app before Siri could instantly share a link and send it to your client. So the idea they've landed on is a good one.
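As a sketch of what that would involve, here is roughly the kind of App Intent a Dropbox-style app would have to ship. Everything app-specific here (the type names, the fake upload client) is invented for illustration; only the App Intents protocol pieces are real framework API.

```swift
import AppIntents
import Foundation

// Hypothetical sketch: what a Dropbox-like app would need to expose before
// Siri could chain "upload this file, then send the link to my client".
struct UploadAndGetLinkIntent: AppIntent {
    static var title: LocalizedStringResource = "Upload File and Get Link"

    @Parameter(title: "File")
    var file: IntentFile

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // Invented uploader; a real app would call its own backend here.
        let link = try await FakeCloudClient().upload(data: file.data,
                                                      named: file.filename)
        return .result(value: link)
    }
}

// Stub so the sketch is self-contained; not a real SDK.
struct FakeCloudClient {
    func upload(data: Data, named: String) async throws -> String {
        "https://example.com/share/\(UUID().uuidString)"
    }
}
```

Once something like that exists, Siri or a Shortcut can take the returned link and feed it into the next action, which is exactly the chaining the ideal scenario depends on.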
Even so, there's still some distance between that ideal and actually feeling like a good personal assistant. Take "send the link to my client": you haven't said how it should be sent. Does it go through an email app, WhatsApp, or Messages? When you don't spell it out, what will Siri do?
This is AI for the rest of us.
At the end comes "AI for the rest of us," which deliberately echoes the old Macintosh slogan, "the computer for the rest of us." That slogan meant: if you want something better, just buy a Mac. The implication now is: if you want better AI, just come to Apple. Honestly, I think there's something off about that, for two reasons. First, it's a metaphor claiming AI today is like the computer back then. Second, you're launching something new but can't come up with a new slogan, so to make people feel good you dust off a line from decades ago and rest on old laurels. And with that, the Apple Intelligence part of the keynote is essentially done.
My guess is that the AI shipping in September will be underwhelming: Apple won't reach its ideal AI in one go. It will move in stages, year after year, adding new features along the way to improve it, and probably a bit slower than we'd like. Apple is starting late, but heading in the right direction. So to me the point of this keynote isn't the experience that actually ships afterwards; it's to convince us that they have a good AI. They don't have one yet, but from Tim Cook's opening slides and the points Craig Federighi laid out, you can see the road Apple wants to take, and it's the right one. Project that forward and you can imagine something like this:
"Siri, take the photo I shot on a boat in the Netherlands last week and send it to my friend Peter on Instagram. Before you send it, touch it up for me: raise the contrast a little so it looks good, then send it over." Or: "Book me an Uber for three o'clock to pick me up, and once I've arrived, message these people." If the technology keeps progressing, the iPhone can gradually become an assistant you barely need to look at.
That future genuinely excites me, and this keynote gives a glimpse of it. It would change how we use the iPhone: today the habit is to stare at the screen and tap, tap, tap; one day we may only glance at the screen briefly each day and let the AI do the rest, far faster. We'll wait and see.
Now let's move on to the rest: the updates to iOS, iPadOS, visionOS, and so on.
…the 2D image, creating a spatial photo with natural depth that looks stunning on Vision Pro.
This time the headline visionOS feature is converting 2D photos into 3D spatial photos. Honestly, it feels a bit gimmicky. It's like 3D movies at the cinema: I used to enjoy them, but after a while everyone drifted back to 2D, because what matters is the content, and a photo is still just a photo. What does making it 3D really add? If one day the feature automatically converts every 2D photo in your library, it will have some value. But look at what's shown: how long do you actually spend looking at a picture, and how much longer will you look at it once it has been converted to 3D?
Now let me ask: have you ever put your travel photos up on the TV so the whole family could sit together and watch the clips you shot on a trip? Has anyone actually done the Vision Pro version of that, where one person puts on a Vision Pro, another puts on theirs, and you browse a photo album together? I do understand the intent: today you hold up a phone full of photos and videos and friends gather round to look at it, and visionOS wants to recreate that situation. Maybe one day everyone will own a Vision Pro and, instead of passing a photo album around, you'll share funny videos and everyone will laugh together; that would be lovely. I just don't know whether it happens in practice. If anyone with a Vision Pro can tell me: Disney+ has SharePlay for watching a movie together, and it's available on visionOS, but does anyone actually gather friends to watch Disney+ that way? As for the gesture updates, they're quality-of-life improvements: they make things more convenient and comfortable, but they're not major updates. Yet they got keynote time, which tells you that only a few months after visionOS shipped, there was no breakthrough to show.
Later this year it gets even better with higher display resolution and size and it can be expanded even further
To be honest, if you use a MacBook, this virtual display feature is genuinely useful: it might save you from buying physical monitors at all. All you need is the MacBook, and you carry around a huge screen that nobody else can see. That's a Vision Pro application I think appeals to a lot of people. Worth mentioning: I browsed some Vision Pro forums, and many people who have upgraded to visionOS 2 report that response time and head tracking are both much improved.
So you can work privately on your long commute or catch up on your favorite shows on a massive screen.
This feature lets you watch movies on the train. First, not many people are going to grab a Vision Pro and take it onto public transport; it probably makes more sense in private transport. Also, on visionOS 1, before this update, the only mode of transport where you could use it to watch movies was an airplane; only once I was on a plane could I watch anything. It feels like the first generation shipped before it was finished: so-called visionOS 1 was essentially a beta that early adopters were testing, and visionOS 2 is the real visionOS people should be using. The first version didn't even support a mouse; imagine selling someone a computer and not letting them use a mouse. After this keynote, my guess is there won't be a Vision Pro hardware upgrade in the next year or two: there's still a lot of software catching up to do, and only when the hardware proves insufficient will they move, since demand in this market isn't very high anyway.
Back on iOS: the ability to tint every icon a single color, plus dark-mode icons. To be honest, this was the part of the keynote I found most disappointing. Apple has always been the company that taught people what design means. Back in the Steve Jobs era, many things on iOS couldn't be customized precisely because Jobs believed the design Apple shipped was the best, everyone should accept it was the best, and therefore you weren't allowed to change it. That was Apple's design philosophy, the philosophy behind its whole system. But things have changed.
Now you can repaint every icon the same color. Where's the respect for the designers who crafted each individual icon? Every icon has its own style and its own color for a reason, and the point was never that you could recolor it on a whim. An icon looks the way it looks so that when you go to find an app, you find it instantly; you recognize the icon at a glance. If the icons on your iPhone are all custom-tinted, what happens to the recognizability and branding of those apps? Every app has branding: Facebook is blue, Instagram is purple; repaint them all yellow and what do those icons and that design even mean?
Dark-mode icons I can understand: some people like a dark wallpaper with dark icons because it's easier on the eyes, and it looks stylish, so letting developers ship a light-mode icon and a dark-mode icon is fine. What I struggle to understand is letting users tint everything one color. Glance at the screen from a distance: how do you tell one icon from the one next to it? Seriously, those tinted icons all look sun-bleached.
The new Control Center, though, is mostly fine. It's not a big problem, just a small one: the switches. I can't quite explain why that one design element is circular while the other buttons aren't; it isn't even a consistent pill shape. Google, for what it's worth, keeps its design language consistent: if round, then everything is round or pill-shaped. Here, some controls are rounded rectangles and some are circles. You shouldn't change things just for the sake of changing them.
If you only ever pay attention to Apple and iOS, you'll think these features are wonderful. In fact people asked for them more than ten years ago, and Android probably had them ten years ago too; now they arrive wrapped up like gifts. That, I suppose, is something Apple is good at.
The next feature matches physical accessories to a specific app, and I think it's a good one, because I play with smart home gear a lot. Every new smart home gadget brings a new brand and a new app, and those apps tend to scan your whole network, probing every IP from 1 to 255, and enumerate every Bluetooth device around you. That isn't great for security and it raises real privacy concerns. With this feature, the question is whether developers will actually build it into their apps and their accessories; if it ships and nobody adopts it, or adopting it costs developers real effort and they don't bother, I'll be very disappointed.
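For what it's worth, here is a rough sketch of how I understand the new accessory-pairing flow (AccessorySetupKit): the app declares which accessory it is looking for and the system shows a picker, so the app never scans your whole network or all of Bluetooth itself. I'm going from memory of the announcement, so treat the exact class and method names, and the service UUID, as assumptions.

```swift
import AccessorySetupKit
import CoreBluetooth
import UIKit

// Rough sketch: pairing a hypothetical smart lamp through the system picker.
final class LampSetupController {
    private let session = ASAccessorySession()

    func startPairing() {
        // Activate the session and listen for pairing events.
        session.activate(on: DispatchQueue.main) { event in
            // Handle pairing events (accessory added, removed, and so on) here.
            print("Accessory session event: \(event)")
        }

        // Describe the one accessory we care about, instead of scanning everything.
        let descriptor = ASDiscoveryDescriptor()
        descriptor.bluetoothServiceUUID = CBUUID(string: "FFF0") // made-up service UUID

        let item = ASPickerDisplayItem(name: "Smart Lamp",
                                       productImage: UIImage(),
                                       descriptor: descriptor)

        // The system shows the picker; the app only ever learns about this accessory.
        session.showPicker(for: [item]) { error in
            if let error { print("Picker failed: \(error)") }
        }
    }
}
```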
Messaging via satellite, on the other hand, is a genuinely good feature. If you only ever travel to well-developed places it may seem useless, but if you like getting out of the city, there are plenty of places on Earth, even a mountain a few kilometres away, with no cell signal at all. There, this becomes a survival feature, or at least a way to stay in touch with family and friends; it's useful and important. Of course it isn't available in every country. The idea is that even with no connection you can still send a message, and for now it's free in the countries where the satellite features work, even after upgrading to iOS 18. I don't know whether it will become paid in the future.
We are supporting SMS messaging via satellite too
In other words, you use the cell towers for as long as you can reach them, and when you can't, a satellite bridges the gap so your SMS messages still get through. That's a powerful feature.
like new ways to personalize your home screen.
Looking at the iPad screen in the demo, everything is tinted blue. In practice, though, when someone unlocks their phone or iPad they glance at the wallpaper for a moment, then open an app and get on with using the device.
You can ask for permission to remotely control your iPad or iPhone.
Here we get to remote control. It isn't something only iPads and iPhones can do, and the feature is genuinely useful, but it also brings a risk of fraud. Think about why, for more than ten years, the iPhone couldn't be remotely controlled at all: precisely so you didn't have to worry, because nobody could take over your iPhone. Now add how rampant AI-assisted fraud already is. Give scammers a simple, easy-to-use remote-desktop capability and scamming gets easier. If one day you suddenly see things moving around on your iPad by themselves, it will be unnerving. The iPhone's most famous promise was that this couldn't be done, so you could relax; now that it can be done, how do you relax? I think it chips away at the safe image of Apple's system.
Smart Script
Smart Script is similar, in the sense that it works by imitating your handwriting. For some people handwriting is deeply personal, it can even carry weight in court, and now the iPad's AI can already imitate it. In the same way, I feel it erodes Apple's image as a safe, trustworthy company. Of course, if Apple doesn't build these features, someone else will, and announcing them at a keynote at least means everyone knows such a capability now exists in the world; if that raises awareness, I think it's a good thing.
Let me show you how it works, browsing through pages of my home screen
iPhone Mirroring lets you use your iPhone from your Mac. Many people think this might be the most useful feature of the whole keynote, and I believe countless people have lived the scenario: the phone is sitting on the other side of the room, you get up, walk over, pick it up, and type out a reply, which is honestly a bit silly. iPhone Mirroring fixes that small everyday annoyance, and it's both very useful and very sensible. The same goes for notifications: when you're looking at one screen, having to turn to a second screen just to see what a notification says is a pain, and now you can see it right on the Mac. So yes, a lot of people call this the most useful feature of the keynote, and I think that's fair.
To be honest, setting the AI aside and judging this purely as an OS update, there are two or three features worth mentioning plus a pile of refinements, but it's mainly an incremental update. Some people say they've upgraded and can't feel any difference at all. That's normal: the aim is to add new features without you feeling any difference, because if you updated and things suddenly looked unfamiliar, it would feel strange. Every update tries to preserve the experience you already have rather than make big changes, and the number of features added this time isn't huge. As usual, the latest update ships alongside the new iPhone each year, so iOS 18 will be officially released then too.
If you found this post insightful, please share it with someone who might benefit. Thanks for reading. Share your thoughts and suggestions in the comments, give it a clap, and follow. Let's build something great together!