
Programming of DIY microscopes: MicroManager vs LabVIEW

In the flourishing field of DIY light microscopy, the choice of a programming environment to control the microscope is critically important. Modern microscopes are becoming increasingly intelligent: they orchestrate multiple devices (lasers, cameras, shutters, Pockels cells) with ever-increasing temporal precision, collect data semi-automatically following user-defined scenarios, and adjust focus and illumination to follow the motion (or development) of a living organism.
So, the programming environment must communicate seamlessly with hardware, allow devices to be easily added or removed, offer rich libraries of device drivers and image-processing routines, and support good-looking, responsive GUIs for end users. This is a long list of requirements! So, what are the options for DIY microscope programming?

There are currently two large schools of microscope programming: Labviewers and Micromanagers. (Update: Matlab for microscope control also has a strong community, comparable to the Labviewers and Micromanagers.) Smaller camps of Pythonians, C++/C-sharpers, and Arduinists also thrive, but their favorite languages seem to be less popular.


A wiring diagram in LabVIEW 1.0, ca 1986.
LabVIEW is an ecosystem of its own and a distinct way of thinking. It was first released by National Instruments for Macintosh computers in 1986, which probably determined its slick design and its outlandish but visually appealing dataflow paradigm. Today it dominates the world of computers (mostly Windows) used in data acquisition, instrument control, and industrial automation. LabVIEW programs are written by connecting virtual instruments (VIs) with wires (data channels) in two-dimensional, multi-layer wiring diagrams. A virtual instrument starts running when all its input wires have received data, and passes new data over its output wires to the next VIs. A universe of device drivers, libraries, and functions comes with LabVIEW and lets you program almost any imaginable mode of device control with nearly arbitrary timing precision. A beautiful GUI (the front panel) is an essential part of every LabVIEW program.

As an ecosystem, LabVIEW communicates seamlessly with a rich variety of National Instruments hardware, such as multifunctional IO boards for programmable analog/digital input-output. This does not mean, of course, that you have to use NI's hardware: LabVIEW drivers are supplied by all major manufacturers of cameras, lasers, and other microscopy devices. LabVIEW can also communicate with an Arduino (via a USB port).

As with Macintosh computers, the beauty is pricey: a LabVIEW Full Development license costs about $3000, while the cheaper Base version is available for $400/year as a subscription.

It can also be painstaking to learn LabVIEW dataflow programming and build correct wiring diagrams, even for seasoned coders with experience in text-based languages. You don't write code in LabVIEW: you select elements (virtual instruments), drop them into the wiring diagram, and connect them with wires.
Even a simple for() loop appears as a confusing frame, which can puzzle a person who is used to programming in C/C++. I first gave up learning LabVIEW, was traumatized for several years, and only after patiently taking online courses for two weeks did I finally start to write sensible programs without the paralyzing panic of "why doesn't it work again!". I now enjoy and appreciate LabVIEW; it has become my favorite programming language.


MicroManager is an open-source plugin for ImageJ, developed at UCSF since 2006 by a team of seasoned microscopists. As such, it is designed specifically to control DIY microscopes. MM is free, easy to use and program, and comes with the full power of ImageJ microscopy plugins. It includes device drivers from dozens of manufacturers for cameras, lasers, shutters, etc. The GUI control is an easy start, but you can dive deeper into MM programming by writing Beanshell scripts and Java plugins, much as you would for ImageJ.

The benefit of MM being free open-source software has a flip side: it is developed by enthusiasts supported by an unstable supply of grant money. After the NIH funding terminated in 2015, MM has depended on money from subscription plans.

MM has drivers for National Instruments multifunctional IO boards, but their functionality is currently minimal. This makes MM unable to generate, for example, the analog output (AO) waveforms needed to control some popular devices such as galvanometric mirrors (galvos).
Galvos are computer-controlled turning mirrors for laser positioning: their angular motion is optically converted into translational motion of the laser beam in the focal plane. They accept an analog input voltage of -10..+10 V as a position command, and can move to a new position within 1 millisecond. Galvos are essential components of some confocal and most two-photon and light-sheet microscopes.
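To make the command interface concrete, here is a minimal Python sketch that converts a desired mirror angle into an analog command voltage. The scale factor of 1 V per degree is a made-up placeholder (real galvo drivers specify their own), but the clipping to the ±10 V input range reflects the interface described above:

```python
def galvo_command_voltage(angle_deg, volts_per_degree=1.0,
                          v_min=-10.0, v_max=10.0):
    """Convert a target mirror angle to an analog command voltage.

    volts_per_degree is a hypothetical scale factor; the actual value
    depends on the galvo driver configuration. The result is clipped
    to the -10..+10 V range that galvo drivers typically accept.
    """
    v = angle_deg * volts_per_degree
    return max(v_min, min(v_max, v))

print(galvo_command_voltage(5.0))    # 5.0  (within range)
print(galvo_command_voltage(15.0))   # 10.0 (clipped to the rail)
```

In a real setup this voltage would be written to an analog output channel of a DAQ board; the point here is only the angle-to-voltage mapping and the hard voltage limits.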

Hybrid programming

Analog output (AO) waveform generation is easy in LabVIEW, but camera control and image processing are more user-friendly in MicroManager. How to choose between the two?

It is possible to combine the best functionality of both environments. The camera can be run from MM and configured to send a digital trigger every time it takes an image. LabVIEW can run a program on a National Instruments DAQ board that listens for the digital trigger and generates AO waveforms after every trigger. These waveforms control galvo scanning and laser power in tight synchrony with the camera. The LabVIEW program can look like this:

The upper portion of the wiring diagram defines the shapes of the two waveforms, and the lower part starts the AO generation task and configures it to be repeatedly triggered until the user presses the Stop button on the front panel.

Did you know you can simply drag the png image of a code snippet into LabVIEW, and it will convert it into working code? Wild!  

So, the program above listens for a digital trigger arriving at a terminal named PFI0 of a National Instruments board (PCIe-6321 in my case), and after every trigger it generates two analog waveforms, a ramp and a square (OK, a trapezoid), at the output terminals ao0 and ao1, respectively. Like so:

This whole sequence can occur in less than 10 ms: the camera sends a trigger, the laser turns on, and the galvo mirror rotates to scan the laser, a useful sequence for generating a digitally scanned light sheet.
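For readers who prefer text-based code, the same two waveform shapes can be sketched in a few lines of Python. The sample count and voltage levels below are illustrative placeholders, not values taken from the LabVIEW program:

```python
def make_waveforms(n_samples=1000, ramp_v=(-2.0, 2.0),
                   high_v=4.0, edge_frac=0.05):
    """Build one period of the two analog-output waveforms.

    ao0: a linear ramp that sweeps the galvo across the field.
    ao1: a trapezoid (a square pulse with short sloped edges) that
         gates the laser power. All voltages here are made up.
    """
    # Linear ramp from ramp_v[0] to ramp_v[1]
    ramp = [ramp_v[0] + (ramp_v[1] - ramp_v[0]) * i / (n_samples - 1)
            for i in range(n_samples)]

    edge = int(n_samples * edge_frac)   # samples spent on each sloped edge
    trapezoid = []
    for i in range(n_samples):
        if i < edge:                    # rising edge
            trapezoid.append(high_v * i / edge)
        elif i >= n_samples - edge:     # falling edge
            trapezoid.append(high_v * (n_samples - 1 - i) / edge)
        else:                           # flat top
            trapezoid.append(high_v)
    return ramp, trapezoid

ramp, trap = make_waveforms()
# In the real setup these sample arrays would be written to the DAQ
# board's analog outputs (ao0, ao1) once per camera trigger.
```

The retriggering itself — rearming the output task after every camera pulse — is what the lower part of the LabVIEW wiring diagram handles in hardware; this sketch only shows how the waveform samples are built.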
Download the LabVIEW code

Which is 'better'?

Micromanager is easier for beginners, and it is free. If you want to start quickly, use MM. If at some point you outgrow it, consider LabVIEW for heavy lifting such as analog waveform generation or real-time processing.
Keep your code simple, comment it, and share. Happy coding!

Update: There is also a significant community using Matlab, which was not mentioned in the original post. My apologies! Matlab provides interfaces for National Instruments IO boards and other hardware, and there is a popular software package for microscope control, ScanImage, written entirely in Matlab. My limited personal experience with Matlab for instrument control is why it slipped my attention.

