Are you happy with Apple Intelligence?

A recent poll by SellCell, as quoted by 9to5Mac (link below), found this:

Smartphone users in general are unsatisfied with the existing AI features as the survey recorded 73% of Apple Intelligence users and 87% of Galaxy AI users stating the new features to be either ‘not very valuable’ or they ‘add little to no value’ to their smartphone experience.

Since this forum seems to be much more pro-AI than other sites I read, what is the consensus after a whole week of the newest features? Personally, it’s all been useless for me; the only thing I care about is making Siri more powerful, so I am waiting for that.

1 Like

+1


“New” Siri is not expected for at least a year, but I’m not waiting. I’m testing server-based AI and apps that I can run on my iPhone/iPad. IMO, the competition will want to be on Apple devices.

1 Like

Writing Tools have been the win for me. I cancelled my Grammarly because Proofread does what I need.

2 Likes

I turned on the notification summaries on my MacBook Air out of curiosity - they range from useless to bad, lol. I could see them being good for people who leave ALL notifications on and just have a firehose of info, but I keep only the most important stuff coming through.

5 Likes

My problem with Proofread is that if I use it here in Safari, it’s an all-or-nothing change. It doesn’t give me a review of what it changed. I can either copy or replace the text. I have used it in other apps where it gave the traditional accept-or-cancel choice for each change, as you see in any other spelling and grammar checker.

2 Likes

That is the one issue that I have with it. So I usually end up copying and pasting the paragraph into the document and reading both carefully. I have to imagine that’s an update waiting to happen.

I don’t think I’m typical. I already use advanced AI image editing and generation through Adobe Photoshop and Firefly, and I have a paid ChatGPT subscription for all my writing, outlining, and now even web searches.

So I have little need for the basic AI tools Apple has released.

As others stated, I do have a Grammarly subscription, so I should probably take the time to see whether Apple’s tools, or some script/shortcut to a ChatGPT workflow, are good enough to let me discontinue Grammarly, but right now I have an “ain’t broke, don’t fix it” approach.
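If I ever do try that route, something along these lines is roughly what I have in mind. It’s only a minimal sketch, assuming the openai Python package (v1+), an OPENAI_API_KEY in the environment, and macOS’s pbpaste/pbcopy; the model name and prompt are placeholders, not anything Apple or Grammarly actually does:

```python
#!/usr/bin/env python3
# Rough sketch of a "proofread the clipboard with ChatGPT" script for macOS.
# Assumes the openai Python package (v1+) and OPENAI_API_KEY in the
# environment; the model name and prompt below are placeholders.
import subprocess

from openai import OpenAI


def clipboard_text() -> str:
    # pbpaste reads the macOS clipboard as plain text.
    return subprocess.run(["pbpaste"], capture_output=True, text=True).stdout


def set_clipboard(text: str) -> None:
    # pbcopy writes plain text back to the clipboard.
    subprocess.run(["pbcopy"], input=text, text=True)


def proofread(text: str) -> str:
    client = OpenAI()  # picks up OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": (
                    "Proofread the following text. Fix grammar, spelling and "
                    "punctuation only; keep the wording and tone. Return only "
                    "the corrected text."
                ),
            },
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    original = clipboard_text()
    if original.strip():
        set_clipboard(proofread(original))
        print("Clipboard replaced with the proofread text.")
```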

I do relish the day that Siri becomes not just less annoying, but actually useful and next-gen capable, but that feels like chasing a mirage.

1 Like

The only value I’ve gotten is turning some inside jokes with friends into generated emoji that we all now use liberally.

5 Likes

From the article cited at the top of this thread:

It remains to be seen whether … ChatGPT integration will change that view.

President Woodrow Wilson and “Hunter DeButts” anyone?

Save yourself some embarrassment and never use ChatGPT as a defense on a national television show.

It’s a big nothing burger. As the OP said, the real value proposition for Apple here is to make Siri actually useful and to tap into the local data (such as email, calendar, photos, etc.) that the likes of OpenAI can’t access.

Only my Mac is supported but I turned it all off because that was the only way I could find to disable suggested message replies. I wish they’d stop wasting resources on this and just fix some bugs.

The only problems I have with it so far:

  1. “Apple Intelligence” is like “iCloud”. Which bit?
  2. Related to #1, I wish they would say what they are delivering. The Siri demo from WWDC is what I want. We have “better Siri” but not that Siri. Or do we have parts of it?

The biggest problem I have overall is the tech press fawning or raging over it when it’s not a complete story. The fact it isn’t a complete story is fair game, but polling whether “it’s good enough” is kinda pointless.

Also a problem with the tech press: the “better” Siri has almost exclusively been described in terms of quotes from Apple’s press material… which is unfortunate because it’s a confusing mess. “Use natural language for Music search…” Hey! Sounds cool! “…such as specifying multiple categories”. Err, OK, is that all I can do? Because I tried a more generic query for some music and the result was that a single song played, and I have no idea how one led to the other.

EDIT: General rant… if any outlet cannot do anything more than reword a press release, they should just link to the press release. Don’t even do pull quotes because it’ll just dilute the message.

3 Likes

It’s a beta. There are not enough fully coherent features yet. It’s too early to make judgements. It’s not one thing anyway.

What is already obvious is that it will be integrated into the system (like most Apple features), so it will be quietly there to do things when you need them; it will improve over time, and it cares about privacy. Wait and see.

2 Likes

It is not. They not only released it to the world, they heavily marketed it. Every tech company is doing that, though; Apple was just late to the party.

I have no doubt it will get better, I just don’t think most of it will be any more useful once it has. Again though, if it makes Siri better, that’s all that matters to me.

The problem with the tech press is all they have to report is what Apple leaks, IMO, to Mark Gurman.

2 Likes

My most-used AI features from Apple came out in previous years, when they were referred to as machine learning.

Far more useful than summaries or writing tools is the ability to copy text from an image, or to copy the subject of a photo into a new file without the background. Those are super useful to me and I use them all the time for work. The fact that I can pause a YouTube video that contains on-screen text, then select and copy that text, is bonkers and has proved useful to me many times.

As a naturalist who spends a lot of time outdoors, I love learning about and cataloging the plants and animals around me, and the Photos app’s identification feature is one I use constantly. Honestly, it feels like living in Star Trek. I take a photo of something and within seconds the Photos app provides not just an identification but links to instantly cross-reference with other images or learn more via Wikipedia.

As for this year’s new features, the sloppy roll-out and media coverage, I just shrug. It is what it is. I’m taking the long view. These features will add to the previous years. They will accumulate over time to form an ever better, more complete toolbox. It’s not about this year’s features, it’s about the process and the direction.

4 Likes

Good point! Though even Apple refers to those differently from the features being complained about in this thread.

Similarly, I’m very happy with the more natural-language search that Photos added this year.

I use this a lot too, especially with birds. When it works, it is awesome, but it fails a lot. Birds, I would say, it gets about 90% right. Plants it gets right only about 20% of the time, usually because it doesn’t consider location data. For instance, it identified a palm tree in Florida as some tree that grows in New Zealand. It told me some moss on a tree was lichen from the Arctic. Locations it never gets right, despite the photo having GPS data. It told me a waterfront park I was at was a dock on the other side of town.

I took a picture of a sunset at a dock last month. The Photos pin for it popped up not identifying anything, just showing random pictures of docks and bridges at sunset. I am not sure what the purpose was. (See below)

I love it when it works (though I still use Merlin Bird ID, because when it can’t settle on an identification it gives me other options, unlike Photos), but there needs to be a way to clear the pin when it is wrong, or a way to report that it is wrong. As it stands, I have a bunch of photos with bad pins.

Edit to add: one other thing it has impressed me with. I have two torties (cats that are black/orange/beige) and it can actually tell them apart and gets their names right. It took a while, but eventually it got it and now it seems to nail it every time.

Saying I’m happy with it is far too strong. It has benefited me in zero ways since it rolled out. It’s far too underdeveloped compared to the tools I use elsewhere, like Claude (which is my current favorite). Image Playground needs a lot of work, and the writing tools are something I only tried a few times before deciding I was much better than they are, which admittedly isn’t a high bar, looking at the software. When they make Siri better…a lot better, this answer may change.

1 Like

Only apps using the latest macOS text engine support the ability, when proofreading, to highlight the changes and to click through them. Without those abilities, proofreading with the Writing Tools is a bit of a dead loss (as others in this thread have already highlighted). However, there is a workaround that may be worthwhile for some text.

If you copy and paste the text into TextEdit, you are able to click through the changes. Admittedly, the changes have still been applied to the TextEdit text (so you probably don’t want to copy that text back over your original), but I simply make any requisite changes manually in the original text if I want them.
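If you’d rather script that hand-off, here is a rough sketch of one way it might be done, assuming macOS’s pbpaste and open commands are available; the scratch-file location is arbitrary, and window placement is left to whatever tool you prefer:

```python
#!/usr/bin/env python3
# Sketch: dump the current clipboard into a scratch file and open it in
# TextEdit, so Writing Tools' click-through review can be used there.
# Assumes macOS's pbpaste and open commands; the file path is arbitrary.
import subprocess
import tempfile

clipboard = subprocess.run(["pbpaste"], capture_output=True, text=True).stdout

with tempfile.NamedTemporaryFile(mode="w", suffix=".txt", delete=False) as scratch:
    scratch.write(clipboard)
    scratch_path = scratch.name

# Proofread in TextEdit, then carry any wanted changes back to the
# original text by hand.
subprocess.run(["open", "-a", "TextEdit", scratch_path])
```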

In my case, as an Alfred and Moom user, I’ve rather streamlined the process of getting the text into a sensibly placed and sized TextEdit window.

Stephen

1 Like