My pattern for a while now has been to upgrade every other year, so I get two years of improvements at once. One of the things from last year, but new to me, is the Camera Control. I was using the Action button to open the camera on the 15 Pro, so I’ve now changed that to Silence. It’s funny: after two years of using the Action button, I still sometimes try to open the camera that way, forgetting about the Camera Control. I set swiping on the Camera Control to change the zoom level. I’m still getting used to it; it seems to take a few swipes before it registers, and I’m not sure how this is supposed to work.
I like that it’s easy to switch between photo and video; that’s what I do most often, and I think it’s a good change. I’m fine with all the other modes being hidden behind another tap. It’s easy enough to just start scrolling and get to Portrait, Pano, etc.
I spent this past Tuesday walking around NYC. This was the day Trump was at the UN, and there was tons of security, protestors, etc. Consequently, I wasn’t able to get very close to the UN.
This shot is from the same spot at 8x zoom. I’m amazed at how much detail the camera has captured from so far away. You can see the antennas and lights on the top of the building. Also the pattern of the colored stones is very clear.
I heard a whirring sound overhead and saw a drone high up in the sky. I used the 8x zoom to capture this photo of it.
Another thing I’m loving is Visual Intelligence. I know this isn’t new, but I didn’t really use it until now. As I walked around NYC, I was using it to try to identify the different buildings. Sometimes it could; other times it would just give a generic description of a building because it didn’t know what it was.
Badly. Apple are trying to replicate some of the controls of a dedicated camera, but on a dedicated camera there are multiple buttons, and dials, to do these functions.
I still use a DSLR, and one of my criteria when purchasing a new one is the number of buttons and dials on it, because swiping around on screens, even touchscreens, just isn’t fast enough. Apple have you pressing, pressing harder, and swiping, possibly doing all three at once inadvertently, and then they overload the functions on it, too.
How do you find the 8x with regard to camera shake when shooting handheld? The longer the zoom, the more I find I need some sort of firm support, like a tripod or leaning against a wall.
Yours look great btw, so curious to learn if you used any supports, or if this was shot by impressively steady hands.
I read a review where the reviewer said stabilisation was a challenge at 200mm.
But thinking about it, it should be no worse than at 100mm, given that the one is a central crop of the other.
This is still, however, one of the big challenges of holding a small, valuable slab of glass and metal in your fingertips. Traditional cameras were designed with grips, although some modern cameras are starting to go the slab way. In the case of reflex cameras, or mirrorless bodies with EVFs, you also have the viewfinder pressed against your face as a further point of steadying. Even with all this, longer focal lengths are still a problem, so there are techniques for further steadying.
The one technique I can think of that would work with a phone is to breathe deeply and slowly, then press the shutter at the point of complete exhale, with a brief hold of your breath at that point.
Another option with the larger camera systems is the (imho) much easier adjustment of shutter speed. The old rule of thumb is to keep your shutter speed at least as fast as one over your focal length: a minimum of 1/200th of a second for a handheld 200mm lens, for instance. Not sure if this still translates to smaller formats like a phone lens and sensor combo, but it gives you an idea.
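For what it’s worth, that rule of thumb is easy to write down. This is just a sketch, assuming the rule is applied to the 35mm-equivalent focal length (which is how phone zoom levels like “8x ≈ 200mm” are usually quoted); the function name is illustrative, not from any camera software.

```python
def min_handheld_shutter(equiv_focal_mm: float) -> float:
    """Reciprocal rule of thumb: the slowest recommended handheld
    shutter speed, in seconds, is 1 over the 35mm-equivalent
    focal length."""
    return 1.0 / equiv_focal_mm

# The 8x zoom discussed above is roughly a 200mm equivalent:
print(min_handheld_shutter(200))  # 0.005, i.e. 1/200th of a second
print(min_handheld_shutter(100))  # 0.01, i.e. 1/100th of a second
```

Stabilization (optical or sensor-shift) effectively buys you a few stops below this, which is why handheld phone shots at 8x work at all.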
One upside to shooting with the phone though, is that tripods / grips / mounts can be safely made way smaller and easily fit in a small bag.
Not to mention in-body image stabilization, which more and more cameras have these days. I have a new camera that is so insanely stabilized that I’ve gotten off hand-held, low-light shots at 1/30 of a second with no visible shake.
To the best of my knowledge, iPhones have OIS—optical image stabilization—but not IBIS—in-body image stabilization. The former counteracts camera shake by moving elements within the lens. The latter counteracts camera shake by moving the camera’s sensor.
There are plusses and minuses to both systems, situations in which one is likely to be more effective than the other, and situations in which you might opt for neither. Here’s a basic explainer.
Ahh, thanks. I’m familiar with the mechanics of IBIS. All I knew about any other system was that it was built into lenses. I didn’t know a name for it. Gotta love another industry with standardised names using basic English words that could mean many things.
But wait! There’s more! We also have EIS—electronic image stabilization. EIS uses algorithmically based software techniques to reduce or eliminate the effects of camera shake on the image. It’s most common in smartphones and action cameras. And … there’s HIS—hybrid image stabilization—which, you guessed it, combines OIS and EIS.
And … a correction: I did some research and it turns out that some iPhones (but maybe not all?) use sensor-shift stabilization, which is IBIS. Apple has labeled its stabilization system “sensor-shift optical image stabilization,” which kind of muddies the waters. Does that mean its cameras use both OIS and IBIS? Anyway, there’s some version of “sensor-shift optical image stabilization” in all of this year’s iPhones, including the Air.
I say “some version” because the 17 Pro and Pro Max use “second‑generation sensor‑shift optical image stabilization” in the main camera and “3D sensor‑shift optical image stabilization and autofocus” in the telephoto camera.