Research microscope controlled by iPad Pro

I recently acquired a new fluorescence microscope for my lab that is controlled entirely by an iPad Pro. The iPad runs an app (downloaded from the regular App Store no less) that controls everything on the scope: high-power LEDs for transmitted light, multiple LEDs for excitation using different wavelengths, focusing motors, and a high-resolution, high-speed CMOS camera. All of this control happens over the standard USB-C connector OR by joining the Wi-Fi access point that is integrated into the microscope’s motherboard. Obviously the motherboard is doing the low-level control of the microscope hardware, but the iPad is displaying a near-live view from the camera and all image acquisition happens on the iPad. The software also controls the stage motors to capture a z-stack, automatically changing the focal plane through the specimen and capturing up to 4 channels as it goes.
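At its core, that z-stack capture is a nested loop over focal planes and channels. Here's a rough sketch of the idea; the `move_focus` and `capture` functions are hypothetical stand-ins, not the app's actual API:

```python
def move_focus(z_um):
    """Stub: real code would command the focusing motor to z_um micrometers."""
    pass

def capture(channel):
    """Stub: real code would fire the matching excitation LED and trigger the camera."""
    return f"frame[{channel}]"

def acquire_z_stack(z_start_um, z_step_um, n_planes, channels):
    """Step the focal plane through the specimen, grabbing every channel at each plane."""
    stack = []
    for i in range(n_planes):
        move_focus(z_start_um + i * z_step_um)
        stack.append({ch: capture(ch) for ch in channels})  # up to 4 channels
    return stack
```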

I’m not surprised that the iPad hardware is up to the task of capturing and storing these large images, or even that it can keep up with the file I/O and screen refresh rates needed to achieve smooth real-time viewing and focusing as a digital viewfinder. I’m also not surprised it can handle the image processing demands to do the digital haze removal in software. I am more impressed by the deep integration the company achieved between the microscope control hardware and the iPad. This is a great example of what the “Pro” level hardware is capable of in the hands of a talented developer sufficiently motivated to adopt the iPad into their system. The iPad is really the perfect form factor to control a microscope, too – so much better than needing to interact with a mouse, keyboard, and a separate monitor off to the side, which is how most microscopes are controlled. Not only is it better at this job, it’s also cheaper than a separate computer and a similarly-specced high-quality monitor.

Is anyone aware of other examples of iPad integration like this? I’d be interested to learn about them. I know that much of our conversation around the iPad has centered on the user-facing features Apple implements and the degree to which those designs enable users to adopt the iPad for their tasks, as well as on feature parity between Mac and iPad versions of apps. This microscope is an example of how the OS-level frameworks Apple has created for developers allow a wide array of applications to be implemented, but I’m not familiar with many other such examples.

I’m looking forward to training my students on this new scope and starting data collection in the next few weeks; I’ll share updates here as we start to do “real” work on it. If I understand the community sentiment here, I’ll either be in a world of pain or in heaven on earth once I open the Files app… :grinning:

10 Likes

There’s a whole new category of telescopes for amateur astronomy in the last couple of years (“smart scopes”) that integrate a telescope, camera, and the means to focus and point at things. Instead of looking through an eyepiece, you have an app on your tablet or phone that shows the images being taken and “stacks” lots of exposures to reduce noise and increase clarity. Not just the iPad, but it’s a surprisingly good form factor and approach for what used to be painfully complicated (think dozens of USB cables snaking around a telescope and mount, typically running back to a PC laptop running some esoteric collection of software to separately control pointing, alignment, and photography).

Google the Seestar S50 (the most popular one) or Vaonis (which is widely reported to be following Apple’s design and maybe other business practices!)
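The “stacking” these scopes do comes down to averaging many aligned exposures: independent noise falls roughly as 1/√N while the signal stays put. A minimal NumPy illustration with a simulated faint source (the scene and noise levels here are made up for demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)

# A made-up "true" scene: one faint point source on a dark sky.
truth = np.zeros((64, 64))
truth[32, 32] = 5.0

# Simulate 100 short exposures, each corrupted by read noise (std = 1.0).
frames = [truth + rng.normal(0.0, 1.0, truth.shape) for _ in range(100)]

# Stacking = averaging the aligned frames; residual noise drops ~1/sqrt(N).
stacked = np.mean(frames, axis=0)

single_noise = np.std(frames[0] - truth)
stacked_noise = np.std(stacked - truth)
print(single_noise / stacked_noise)  # roughly 10 for N = 100
```

Real smart scopes do more (frame alignment, outlier rejection for satellites and hot pixels), but the noise-averaging principle is the same.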

3 Likes

Cool!! This demonstrates a design where the instrument developers are also instrument users. They have not subcontracted the control and analysis software out to geeks tethered to the LabVIEW or Windows world. I can imagine the questions at the outset …

  • What is the best platform to manipulate images in an intuitive way when viewing … an iPad (not a desktop monitor + mouse + …)
  • What is the best platform to edit images in an intuitive way when changing aspects such as contrast, brightness, … an iPad (not a desktop monitor + mouse + …)
  • What is the best platform to provide a user with a crisp form factor and input layout for controls to change aspects of a hardware instrument … a touch-tablet device (not a UI with mouse-clickable buttons on a monitor)
  • Can we split the hardware and software effectively between the instrument and the control input using an iPad … yes
  • Can the iPad communicate rapidly enough with the device to provide real-time experience for the user … yes

→ decision complete

I’d be curious about the split between what portion of the image processing is done on the iPad versus on the instrument itself. But otherwise yes: if today’s iPad can stream 4K movies at 60 fps with no lag, it should certainly be able to communicate rapidly enough with the microscope’s camera hardware to give a real-time experience.

As for using the Files app (and perhaps iCloud) to share and/or distribute image files, I recommend setting up a dedicated drop location for the microscope files on a robust cloud service. Within my university we use Google Drive, and I see that one of the ads for the microscope shows setting up such external drop locations (they demonstrate it with Dropbox). Distinguish between three levels: sharing images out for offline processing, backing up the entire set of system images, and archiving sets of images, for example from users who are no longer active.

On our multi-user instruments, I set up top-level folders for each user using a (LastName FirstName) convention. I insist that all users put their files in their own directory, and otherwise leave how they organize their files to them. Defining these administrative procedures at the outset will save you headaches later when you go cleaning up image-file clutter, especially on an instrument that I imagine will output lots (and lots) of large image files.
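Scaffolding that layout is a one-time script. A small sketch of the idea, with a hypothetical roster and drop-folder path (substitute your own cloud-synced location and user list):

```python
from pathlib import Path

# Hypothetical examples: the real roster and path would be your own.
users = ["Doe Jane", "Smith Alex"]   # "LastName FirstName" convention
root = Path("microscope_drop")       # e.g. a Google Drive or Dropbox folder

# Three top-level areas: sharing out, full backups, and long-term archive,
# each with a per-user subfolder.
for area in ("shared", "backup", "archive"):
    for user in users:
        (root / area / user).mkdir(parents=True, exist_ok=True)
```

Rerunning it after adding a user is harmless, since `exist_ok=True` skips folders that already exist.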

sidebar … One of my “hobbies” is developing software for science / engineering analysis, more recently image processing. I know that fluorescence microscope images are in a world all their own compared to my base level needs. Should you be interested in something beyond the ImageJ level to create what I call “publication ready” image layouts, feel free to contact me off-line.

–
JJW

3 Likes

The telescope market is an interesting example, with obvious overlap in functionality and requirements with the microscopy world. The integration is maybe a little less specific to the iPad, with some (most?) of these able to run with either Android or iOS mobile devices, including phones or tablets. These are much more consumer- or prosumer-oriented than a research microscope, too. Great example though, thanks for pointing it out!

This is a really insightful analysis of the needs and design decisions that probably went into the incorporation of the iPad – brilliant! Based on their marketing materials at least, the company does seem keen to bring a more disruptive approach to the microscopy market, which is frankly ripe for disruption.

I also really appreciate your input on how to set up a shared cloud storage provider. I spent some time today developing a workflow for my students to screen some reporter lines we’re making, and after just an hour of captures it is clear that a good system for file management and backups is going to be important. The microscope accepts USB disks for exporting files, presumably acting as a pass-through from the iPad. I might dabble with inserting the OWC Thunderbolt hub between the iPad and microscope to see if I can use the mass storage support on the iPad, which would support more kinds of devices than the port on the microscope. But I have also already linked a Dropbox folder for export, per your suggestion.

Thanks, too, for your generous offer of support – I will use the DM function here on the forum to contact you at some point.

1 Like