With the University (and entire planet) in lockdown due to the ongoing COVID-19 pandemic, now seems as good a time as any to make you aware of the Bioimaging Hub’s ‘rebooted’ YouTube channel.
A decade is a long time in imaging. Way back in 2009 we set up a YouTube channel to showcase the capabilities of our new, all-singing, all-dancing Leica SP2 confocal microscope. At the time, we uploaded a collection of short 3D animation sequences that highlighted some of our ongoing research applications. Fast forward eleven years. Whilst many of those early demo videos have been widely viewed (one over 20,000 times), they’re starting to look rather dated, particularly when compared with the material we’re now producing using our new-fangled confocal and lightsheet systems and high-end supporting software. As I say, a decade is a long time in imaging.
So, with the advent of spring, we thought it was high time we dusted down our YouTube channel and gave it a much-needed overhaul. As well as introducing lots of nice new image content from some of our latest 3D imaging systems, we felt that the channel would have a greater sense of purpose if we were to develop it into an educational resource for microscopy and bioimaging, with obvious relevance for remote learning (i.e. perfect in our current situation). Consequently, we have started to collate the most useful and relevant of YouTube’s microscopy-related content (webinars, tutorials, demonstrations etc.), ranging from the basic principles of light microscopy to cutting-edge fluorescence techniques such as FLIM, so that it is all under one roof for your convenience :)
One of the things we hoped to provide our userbase was a series of video tutorials for the Hub’s many microscope systems. Whilst there’s a great deal of useful training material on YouTube, in the main it tends to be aimed at high-end, turn-key imaging systems. Furthermore, not all microscopes are created equal: each has its own peculiarities reflecting its intended function, and most ‘evolve’ over time, through upgrades, to accommodate the vagaries of research. So, with this in mind, we have started to create our very own bespoke training videos for each of the Hub’s microscope systems (example here).
The new training videos will supplement the standard operating procedures (SOPs) we have written for all of our imaging equipment and should provide an invaluable resource for user training. As such, they will be embedded within the appropriate sections of the hub’s SOP repository (read more here). The online video content and its associated SOP will also be viewable at the click of a mouse button via a desktop shortcut on all of our microscope-associated PCs allowing easy access during instrument operation.
If you have time on your hands, then please pop over to YouTube and take a look at how our channel is developing (link here). It’s still a work in progress but, as I say, it has the potential to be a very useful resource; not only for Hub users, but for anyone with a passing interest in microscopy and bioimaging. Constructive feedback is welcomed.
The eagle-eyed amongst you may have noticed that the main corridor within the Bioimaging Hub is looking a little nattier these days. We thought it required some brightening up and so have started to adorn its walls with some nice, new A0-sized foamex prints of microscopy images that we’ve generated in-house on the Hub’s microscope systems. We’ve tried to select images that showcase the beauty of the unseen microscopic world and reflect the art in science, or ‘SciArt’ as it is now known (see ref below). We hope the images stimulate interest and highlight the state-of-the-art research and research facilities within Cardiff School of Biosciences. No prizes for spotting the artistic influences on some of our ‘works’ :)
The Bioimaging Research Hub does a nice sideline in 3D printing scale replicas of biological samples for use in science engagement and teaching. These can be made from the teeniest of microscopic samples imaged via optical sectioning microscopy (i.e. confocal or lightsheet) or from large anatomical samples imaged via 3D photogrammetry or 3D scanning. We’ve posted a few blogs on this site in the past describing 3D pollen models that we’ve made for various research groups within Cardiff University (e.g. the ‘Footprints in time’ and ‘PharmaBees’ projects) and for a growing number of external organisations in the UK and abroad (e.g. the Met Office, the Smithsonian Institution etc.).
Recently, we were approached by the National Botanic Garden of Wales (NBGW) to generate 3D models of twenty different species of pollen grain identified in honey by Dr Natasha de Vere’s research group, for use in their science outreach and engagement programme. Natasha is Head of Science at the NBGW and is using cutting-edge DNA barcoding technology to understand pollinator foraging preferences. This research is providing amazing insights into the selective range of plant species that important pollinating insects, such as bees, visit when foraging (you can read more about this fascinating work here and in the reference below).
To generate the 3D prints we first needed pollen samples from each of the respective plant species (you’d be forgiven if you thought the Nat Bot Gardens could provide these ‘off the shelf’ – nope!). Now, coming from a zoological background, my botany field skills are best described as rudimentary (and that’s putting it mildly). So, equipped with my smartphone, a plant identifier app downloaded from the Google Play store, and some zip-lock sample bags, I embarked upon a ‘Pokémon Go-style’ palynological quest (‘gotta catch ’em all’) that took me, obsessively, to the local parks, woodlands, river and rail embankments, country lanes and coastlines (and even garden centres) of South Wales.
After some effort (and with help from my long-suffering family), I managed to identify and collect all of the pollen species on the wish list. I then imaged representative grains from each species using the Hub’s Zeiss LSM880 Airyscan confocal microscope. Individual grains were optically sectioned through their volume, 3D reconstructed and then output in a file format suitable for 3D printing on our Ultimaker 3D printer (method described in the reference below).
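For the curious, the reconstruction step above (z-stack in, printable surface out) can be sketched in a few lines of open-source Python. This is a minimal illustration, not the Hub’s actual pipeline (see the reference below for the published method): the Otsu thresholding, the voxel spacing values and the use of scikit-image’s marching cubes are all assumptions for the sake of example.

```python
import numpy as np
from skimage import filters, measure  # assumed available (scientific Python stack)

def stack_to_ascii_stl(volume, out_path, spacing=(1.0, 0.5, 0.5)):
    """Turn a confocal z-stack (z, y, x intensity array) into an ASCII STL surface.

    `spacing` is the voxel size in z, y, x (illustrative values here);
    returns the number of triangles written.
    """
    level = filters.threshold_otsu(volume)  # automatic intensity threshold
    # Extract a triangulated isosurface at the threshold level.
    verts, faces, _, _ = measure.marching_cubes(volume, level=level, spacing=spacing)
    with open(out_path, "w") as f:
        f.write("solid grain\n")
        for tri in faces:
            v0, v1, v2 = verts[tri]
            n = np.cross(v1 - v0, v2 - v0)       # facet normal from the triangle edges
            norm = np.linalg.norm(n)
            n = n / norm if norm else n
            f.write(f"facet normal {n[0]} {n[1]} {n[2]}\nouter loop\n")
            for v in (v0, v1, v2):
                f.write(f"vertex {v[0]} {v[1]} {v[2]}\n")
            f.write("endloop\nendfacet\n")
        f.write("endsolid grain\n")
    return len(faces)
```

In practice the exported STL would then go to slicing software (e.g. Cura for the Ultimaker) to generate the print itself.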
The finished 3D pollen models are shown in the photograph above – each model is approximately 15 cm in diameter (i.e. enlarged by a factor of approximately ×400 relative to the original pollen grain). The models will be on display at the Growing the Future stand at this year’s Royal Welsh Show (20th-23rd July 2019) and also at the Pollinator Festival at the National Botanic Garden of Wales (24th-26th August 2019).
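For anyone wondering what those figures imply about the size of the original grains, the arithmetic is straightforward (a quick sanity check using only the numbers quoted above; both figures are approximate):

```python
# Derive the implied pollen grain size from the model diameter and scale factor.
model_diameter_um = 15 * 10_000   # 15 cm expressed in micrometres
scale_factor = 400                # stated enlargement, approximately x400

grain_diameter_um = model_diameter_um / scale_factor
print(grain_diameter_um)  # 375.0 micrometres, i.e. a fraction of a millimetre
```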
Hawkins, J., de Vere, N., Griffith, A., Ford, C.R., Allainguillaume, J., Hegarty, M.J., Baillie, L. and Adams-Groom, B. (2015) Using DNA metabarcoding to identify the floral composition of honey: a new tool for investigating honey bee foraging preferences. PLoS ONE 10(8): e0134735. https://doi.org/10.1371/journal.pone.0134735
N.B. All of the above systems are now networked via gigabit desktop switches. Up-to-date standard operating procedures and risk assessments for each system are available through the Bioimaging Hub’s online SOP repository via their desktop folders (‘Read me before use’). See a member of staff for further details.
How do you make a dragon fly (ask it nicely, I suppose)? Well, this was the question we were asking ourselves a few weeks ago after an email enquiry from Dr Trevor Bailey of the National Museum of Wales. Trevor is one of the museum’s senior palaeontologists and is involved in curating many of the museum’s public exhibitions and programmes involving fossils and prehistoric life (you can read more via his profile page here).
Trevor had contacted us to ask whether we might be able to help with a forthcoming exhibition at the museum, called ‘The Fossil Swamp’, by making a scale replica model of an extinct insect species similar in appearance to a modern dragonfly (it’s actually classed as a griffinfly), but with one big difference (and I mean BIG): its size. How big, I hear you say? Well, to give you an idea of its sizeable dimensions, the wingspan of Meganeura was approximately 0.7 metres (i.e. roughly the wingspan of a large, adult sparrowhawk)! Indeed, this is how the insect came to earn its rather ominous-sounding moniker: ‘mega-neura’ means ‘large-nerved’, referring to the network of large veins supporting the insect’s enormous wings (Brongniart, 1893).
In fact, Meganeura monyi is one of the largest known flying insect species ever to grace planet Earth. It lived more than 300 million years ago, in the Carboniferous period, when the atmospheric oxygen concentration was much higher than today’s (around 35% then, instead of 21% now), which, it is thought, allowed the insects of that period to grow to enormous proportions (insects breathe through small holes, or ‘spiracles’, in their body walls connected to branched air tubes, the tracheae, which convey oxygen to their internal tissues). Furthermore, at the time Meganeura was buzzing about, bugging the primitive lifeforms of the day, there were no aerial vertebrate predators around (in fact, birds arrived at the table some 75 million years later), so it could pretty much act with total impunity!
So Trevor supplied us with a digital model of the insect for 3D printing, together with the desired dimensions, based on recorded fossil evidence (Brongniart, 1893). Interestingly, the digital mesh was originally created as a component of a Carboniferous forest simulation by a colleague of his in Germany (link here). In order to 3D print Meganeura’s body to scale on our Ultimaker 3 Extended 3D printer, we fabricated the head and thorax separately from the abdomen and then re-attached them after removal of their supporting scaffolds. We printed these using polylactic acid (PLA) filament at an intermediate print resolution. The first attempt looked okay, but the finished model was rather blocky in appearance, so we smoothed the digital mesh and reprinted at a higher resolution, with much better results.
The next challenge was those huge wings. I downloaded .png image files of the venation patterns recorded by Brongniart (1893) here. However, they were simply too large to 3D print at the desired thickness (and believe us, we tried), so a different approach was necessary. Each wing was laser printed onto a separate sheet of acetate. These were then cut out and laminated, the composite structure increasing the rigidity of the wing whilst still allowing realistic flexion. To attach the wings to the thorax, I drilled holes through adjacent thoracic segments and fed lengths of wire through the holes to support the leading edge of each wing pair. The wire was then bonded in place to prevent any lateral displacement.
Next up was the paint job, which became a labour of love (and an exercise in mindfulness) in my spare time! Now, unfortunately, no one knows what colours or patterns adorned the body surface of Meganeura, as the fossil evidence is all black and white. Artists’ impressions are therefore based loosely on modern equivalents (e.g. dragonflies, damselflies etc.), or have simply been made up to make the insect look as fearsome as possible – it was a carnivorous predator, after all! So, after a few Google image searches just to get some ideas, I finally went with a black and yellow/orange colour scheme with iridescent bronze eyes (with artistic input from Trevor and my daughters). I used acrylic paints, purchased cheaply from The Works, which gave good adhesion and coverage without the need for a primer coat. Fine detail was added under magnified optics.
When the model was fully painted, we attached the wings to the wire frames using extra-strong clear Sellotape before taking it over to Alexandra Gardens, opposite the School of Biosciences, for some wildlife photography, doing our best not to frighten the native fauna (or the general public)!
The model will be on display as part of the Fossil Swamp exhibition at the National Museum of Wales, Cardiff, from 18th May 2019 to 17th May 2020, along with lots of other amazing artefacts from the Carboniferous period. Please go along and visit – it promises to be a fantastic family day out.
Brongniart (1893) Recherches pour servir à l’histoire des insectes fossiles des temps primaires, précédées d’une étude sur la nervation des ailes des insectes (Research to serve the history of fossil insects of the early ages, preceded by a study on the wing venation of insects).
The digital mesh of Meganeura monyi was taken from the Carboniferous forest simulation (page author and domain holder Heiko Achilles); 3D printing by Dr Pete Watson; model painting, wing fabrication and finishing by Dr Tony Hayes; wildlife photography by Marc Isaacs. Blog post by Dr Tony Hayes.
Hands up, who’s seen the provocative Steven Spielberg sci-fi thriller* Minority Report? In the movie, the main protagonist, chief of ‘pre-crime’ John Anderton, played by Tom Cruise, investigates a future crime via a cool gesture-based holographic virtual reality (VR) interface. Whilst current VR technology isn’t quite that far into the future, it’s certainly not far off. Indeed, virtual reality is now becoming a reality in microscopy as researchers strive to improve their 3D understanding of complex biological samples. As the inventor of the confocal microscope and an early proponent of telepresence, Marvin Minsky would certainly approve of this convergence of technologies. The potential is enormous: imagine, for example, being able to take a virtual tour inside a tumour, to climb into an intestinal crypt or to peel apart the posterior parietal cortex – and all without getting your hands dirty!
‘Immersive microscopy’, as it is now known, is an area of imaging in which Zeiss in partnership with software developers arivis are currently leading the field (you can learn more here). To get in on the act, the Bioimaging hub at Cardiff School of Biosciences has been developing a VR application of our own for visualisation and manipulation of volume datasets generated by the hub’s various 3D imaging modalities. We anticipate that this technology will have significant relevance not only to imaging research within the school, but also to teaching and science outreach and engagement.
We’ve been using the affordable Oculus Go standalone VR headset and controller in association with the Unreal Engine 4 games engine to create VR environments that allow interaction with our whole range of surface-rendered 3D models. These range from microscopic biological samples imaged by confocal or lightsheet microscopy, such as cells or pollen grains, to large, photo-realistic anatomical models generated via photogrammetry.
As proof of principle we’ve developed a working prototype that allows users to manipulate 3D models of pollen grains in virtual space. You can see this in action in the movie above. We’re planning further developments of the system, including new virtual 3D environments, different 3D models and object physics, and features such as interactive sample annotation via pop-up GUIs. The great thing about VR, of course, is that we’re limited only by our imagination. To borrow a quote from John Lennon: if ‘reality leaves a lot to the imagination’, then VR leaves a lot more!
*we’ll conveniently ignore his more recent VR-themed sci-fi flick ‘Ready Player One’!
Above: A screenshot of the Bioimaging Hub’s SOP repository
If you weren’t already aware of the Bioimaging Hub’s SOP repository (N.B. there are shortcuts set up on all of the networked PCs within the facility), then please take a look at your earliest convenience. The database was set up as a wiki to provide Hub users with up-to-date protocols and tutorials for all of our imaging systems, experimental guidelines for sample preparation, and health and safety information, in a variety of multimedia formats and in one convenient, easily accessible location. It’s still a work in progress and we would welcome any feedback on how the resource could be further developed or improved.
One of the problems associated with imaging fluorescence in large biological samples is the obscuring effect of light scatter. Traditionally, this has meant physically sectioning the material into optically thin slices in order to visualise microscopic structure. With the advent of new volumetric imaging techniques, e.g. lightsheet microscopy, there is increasing demand for procedures that allow deeper interrogation of biological tissues. With this in mind, an innovative tissue clearing system has recently been purchased through generous donations to the European Cancer Stem Cell Research Institute (ECSCRI). The equipment, which will be housed in ECSCRI lab space, allows large, intact histological samples to be rendered transparent for fluorescent labelling and 3D visualisation by confocal and lightsheet microscopy.
The X-CLARITY tissue clearing system is designed to simplify, standardise and accelerate tissue clearing using the CLARITY technique (an acronym for Clear Lipid-exchanged Acrylamide-hybridised Rigid Imaging/Immunostaining/in situ-hybridisation-compatible Tissue hYdrogel). In this technique, preserved tissues are first embedded in a hydrogel support matrix. The lipids are then extracted via electrophoresis to create a stable, optically transparent tissue-hydrogel hybrid that permits immunofluorescent labelling and downstream 3D imaging.
The new equipment and associated reagents will have wide relevance to many areas of research in Cardiff, including deep visualisation of breast cancer tumours by Professor Matt Smalley’s research group using the Bioimaging Hub’s new lightsheet system. You can see a video here that shows the power of the CLARITY technique for high-resolution 3D visualisation of tissue and organ structure.
Above: Maximum intensity projections of actin stress fibres (red) and microtubules (green) of an endothelial cell imaged on a Zeiss LSM880 Airyscan confocal microscope. Z-stacks were sampled via: A. conventional confocal optics (5 minutes scan time); B. Airyscan Fast, 0.5× Nyquist sampling (30 seconds scan time); C. Airyscan Fast, 1.5× Nyquist sampling (1 minute scan time); D. Airyscan Fast, 2× Nyquist sampling (5 minutes scan time).
Through the generous support of Cardiff School of Biosciences, the Bioimaging Research Hub has recently upgraded its Zeiss LSM880 Airyscan confocal system for fast image acquisition via the Zeiss Fast module. The Airyscan system allows imaging at a resolution 1.7× that of conventional confocal optics (find out more here) and the new Fast upgrade provides a 4× speed enhancement with an improved signal-to-noise ratio. The technique uses beam-shaping optics to elongate the excitation spot along the y axis so that it simultaneously covers four lines in a single scan. This parallelisation approach, whilst increasing acquisition speed by a factor of four, allows high pixel dwell times to be maintained, resulting in a high signal-to-noise ratio. You can read more about the technique below or, if you would prefer, kick back and watch this explanatory webinar courtesy of Zeiss.
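The speed argument above boils down to simple arithmetic: if four lines are acquired per sweep instead of one, the frame takes a quarter of the time at the same pixel dwell time. A back-of-the-envelope sketch (the frame size and dwell time below are made-up illustrative numbers, not instrument specifications):

```python
def frame_time_s(lines, pixels_per_line, dwell_us, parallel_lines=1):
    """Time (in seconds) to scan one frame when `parallel_lines`
    lines are acquired simultaneously per sweep."""
    sweeps = lines / parallel_lines          # fewer sweeps when lines are parallelised
    return sweeps * pixels_per_line * dwell_us * 1e-6

# Same 1024 x 1024 frame and 2 microsecond dwell time in both cases:
conventional = frame_time_s(1024, 1024, dwell_us=2.0)                   # 1 line per sweep
fast = frame_time_s(1024, 1024, dwell_us=2.0, parallel_lines=4)         # 4 lines per sweep

print(conventional / fast)  # 4.0 - a fourfold speed-up at unchanged dwell time
```

The key point, as noted above, is that the speed-up comes from parallelisation rather than from shortening the dwell time, which is why the signal-to-noise ratio is preserved.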
How many of you can tell the difference between the brains of, say, a human, a black rhino and a sloth bear? Nope, me neither, but apparently, when it comes to brains, it’s not just size that counts (see below). This conundrum is one of the many fab activities on offer this weekend at the National Museum of Wales’ annual Brain Games event, funded by the Society for Neuroscience and highlighting the range of brain-related research undertaken at Cardiff University.
In the build-up to the event, our very own Pete Watson, in collaboration with Emma Lane (PHRMY), has been 3D printing brains from a wide variety of animal species, including human, on the Bioimaging Hub’s Ultimaker 3 Extended dual-colour 3D printer. However, just to make things a little more challenging, they’ve generated two sets of 3D prints: the first set of brains are anatomically correct scale models; the others have all been 3D printed at an identical size – and it’s up to you, dear reader, to determine which brain belongs to which animal species.
Above is a small selection of the 3D printed brains that will be on display at the National Museum this Sunday, including a glow-in-the-dark brain from… well, that would be telling, wouldn’t it?!