Has someone already started this thread? I missed it if they did.
What is everyone hoping for at WWDC this year? I’ve got some real “pie in the sky” ideas. I’d (still) like to see some major enhancements to macOS for information management. I’d like to be able to run macOS from my phone, and connect my phone to a keyboard, trackpad, and display. How about cohesive tags for everything in the system that sync across platforms? How about Spotlight and Siri integration with an on-device large language model?
So much I’d like to see for the future of tech, but… you know, ok, maybe we’ll all strap a pair of displays to our face instead.
I’m also looking for iPadOS improvements. In addition to Stage Manager improvements, I’d like to see the lock screen widgets available on the iPhone become available on the iPad. And why limit it to 4 widgets? And speaking of limitations, I’d like the number of Focus Modes to be expanded.
I can’t see it being cheaper. The base Ultra Studio is already $4k. If it has a chip that’s double the Ultra, the interconnect alone for something like that would be crazy (a four-way join, I guess?), and it would start at 128GB of RAM.
My current Mac Pro is something the Ultra could handle, if I could use ARM for that stuff. It’s on the lower end (8-core, 48GB RAM, W5700X). I mostly just want to see it exist.
The significant technical challenge is whether they can 16x the M2/M3 at a reasonable yield and engineer the connections for add-on compute outside the SoC. That’s analogous to the AirPower challenge of computing and addressing overlapping charging zones. They also have a business challenge of developing external GPU and neural-net add-on partners (or making their own, of which nothing has been reported).
As far as market size goes, the desire to run ML workloads has substantially grown demand for serious local processing and memory since the Studio was released. It’s still not a huge market, but it’s well beyond just reviewers and obsessives with $50k.
Multiple options for HomeKit notifications (currently you can get them either when you’re not at home or at night, but not both).
Motion sensor security notifications on Apple TV.
Time ranges for automations: you can use either the sunset/sunrise options or a fixed time frame, but you can’t mix them (so there’s no easy way to automate something between sunset and midnight).
More private AirPlay on Apple TV. We have only one, and I want to use it as the home hub.
I can limit AirPlay only to A) everybody on the network or B) people I share my HomeKit home with. But even if I use “everybody on the network” with password protection, the media title and controls still show in the Home app. I care about my privacy and don’t want my household members to see what I watch (even when I’m not AirPlaying), so AirPlay on Apple TV is useless for me and I’ve turned it off.
Call them whatever, but some form of logical variables or home states in HomeKit, so I could have a switch that won’t turn the lights on me if I went to sleep before sunset.
Better auto-switching of AirPods. If I’m watching something on Apple TV with headphones and answer a FaceTime call, they switch after a couple of seconds. Even if I switch to speaker after accepting the call to make it faster, they override it.
Using the state of the Apple TV as an automation trigger: turning on a light strip when you turn on the TV, dimming the lights when you pause, etc.
Triggering an automation when a device has been in a state for a specific amount of time (no motion for 3 minutes, door open for 1 hour).
Shell or Shortcuts ability to set the wallpaper on a specific Space (macOS).
The Apple TV-related shortcuts from iOS available on macOS.
The ability for tvOS to sync iCloud for more than one user. From what I’ve read on the Infuse forum, they can do that only for one default user, so their app can’t use Apple TV’s different profiles to offer separate libraries and watch histories.
Basically, HomeKit needs a big overhaul.
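The duration-in-state trigger a few items up (“no motion for 3 minutes, door open for 1 hour”) is logic HomeKit doesn’t currently expose, but it’s easy to express. Here’s a minimal sketch of that pattern in Python — this is purely illustrative, not any real HomeKit API; the class name, states, and callback are all made up:

```python
import time


class DurationTrigger:
    """Fire a callback once a sensor has stayed in a target state
    for a minimum duration (e.g. "no motion for 3 minutes").
    Leaving the state resets the timer; the callback fires at most
    once per continuous stretch in the target state."""

    def __init__(self, target_state, min_seconds, callback, clock=time.monotonic):
        self.target_state = target_state
        self.min_seconds = min_seconds
        self.callback = callback
        self.clock = clock          # injectable for testing
        self._entered_at = None     # when the sensor entered the target state
        self._fired = False         # already fired for this stretch?

    def update(self, state):
        """Feed the current sensor state; call on every poll or state event."""
        if state == self.target_state:
            if self._entered_at is None:
                self._entered_at = self.clock()
            elif not self._fired and self.clock() - self._entered_at >= self.min_seconds:
                self._fired = True
                self.callback()
        else:
            # Any other state resets the timer and re-arms the trigger.
            self._entered_at = None
            self._fired = False


# Example: turn off the lights after 3 minutes without motion
# (the lambda stands in for whatever action the automation runs).
trigger = DurationTrigger("no_motion", 180, lambda: print("lights off"))
```

The key design point is that the trigger is stateful: a momentary blip back to “motion” restarts the countdown, which is exactly what distinguishes this from the instantaneous triggers HomeKit automations offer today.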
Yes, Nvidia’s in a great position. Their customers’ applications are valuable, and the sale of data center computing is fairly pure-play but not commoditized in pricing, so they capture a lot of the value. It’s hard to draw conclusions when comparing to Apple’s revenue, since Apple’s is an order of magnitude larger and more diversified. iPhone revenue in particular is keeping Apple’s stock stable.
I just meant that Nvidia is an AI chip company like IBM, Google, AMD, Cerebras, etc. and investors are expecting (hoping) that they are going to profit from this new market. Most AI hardware, at least for the foreseeable future, will probably be located in data centers like those from Microsoft, Google, Amazon, and IBM.
Do you know if Apple’s Pro Workflow Team is still around? That was the team inside the building that housed Apple’s pro products group, and it was supposed to make sure the 2019 Mac Pro really worked for pro users.
I wonder if they think that a modular version of the Mac Studio would be enough for Apple to release as the new Mac Pro.
The M chips contain the CPU, GPU, RAM, SSD controllers, and pretty much everything else. I can’t think of anything that could be placed in a module that could match the speed of the SoC’s internal connections. What am I missing?