Proteomics Blog

Read through my latest blog posts and feel free to comment on them if you like.



Latest Posts

Whilst I’m desperate to get back into the lab and to continue with my research, I can’t say I’m looking forward to the time when the LC system and mass spec are first switched back on. Since we still have a little time before that is likely to happen, I thought it might be worth jotting down a few ideas so that I have a plan, at least, for Day One. Please feel free to comment and add any further suggestions below (all constructive comments are VERY welcome).

Firstly, as our mass spec (triple quad) is no longer under vacuum, a source clean seems like a good place to start. No need to rush this, a few extra hours is neither here nor there. Plus, due to social distancing, there should be plenty of bench space to spread out on.

While the source is in bits and undergoing a clean (sonication etc), time to check the nitrogen supply. Let’s get some clean, dry nitrogen pumping through the system – assuming the generator turns on, that is. Think I’ll disconnect the nitrogen line from the instrument for the first few minutes just in case there is anything in the line which may cause a problem. Over cautious? Maybe, but let’s not tempt fate.

As well as the N2 the LC will need some TLC. Fresh solvents throughout is an obvious first step. What about piston seals? It’s possible that these may have become stuck to the pistons. Careful removal, inspection and lubrication seems prudent. Next, starting with low flow and high organic, switch on the pumps, gradually building up both flow rate and aqueous content. Finally, it’s a case of running multiple blank gradients over the column (to waste) before trying some system suitability standards. Keep a close eye on the back-pressure here just in case!

Mass spectrometer and LC performance need to be checked independently if possible. We use the supplied calibrant solution to infuse directly into the MS to decouple the LC. This can be compared to historical data to determine performance once the masses have been brought back into calibration.

For LC performance, we have a freezer full of six-protein mix digests (all from the same stock) for checking sensitivity and peak shape on a weekly basis, so we need to compare performance to that recorded prior to the shutdown.

Finally, a serum digest sample, which is representative of our real samples, is used as one final check. All being well, we should now be ready to begin analysing live samples as before.


Good luck everyone.


Ten years ago, it would have been hard to imagine a Core Facility MS lab without access to a QTOF. Proteomics had become well established and researchers wanted to know what?, how much?, how modified? and what with?, amongst other biological questions. A combination of a Triple Quad and QTOF enabled most of these questions to be answered or at least attempted with some degree of success. Indeed, back in 2004, our lab opted for a combination of QSTAR and QTRAP to fulfil the group’s proteomic needs.


But what of today? How far away from this situation are we now?


Well, whilst the unique configuration of the QTRAP (triple quad with linear ion trap) is still going strong – albeit several generations later – the QSTAR is sadly no more.


So, what does a core lab do to fill this gap?


The (relatively) new kid on the block is, of course, the QExactive with Orbitrap detection. The mass of marketing hype could make the less experienced mass spectrometrist feel that QTOFs have had their day and are no longer fit for purpose; that, perhaps, technology had overtaken them and other configurations were the state-of-the-art way to go. However, this assumption does not stand up to close scrutiny. Nanoflow columns become overloaded at around 1 µg of digest, with 500 ng being a sweet spot for most complex digests. With this amount of material, a super-fast QTOF, such as the TripleTOF 6600, can acquire data at 50 Hz, i.e. a 100 ms full scan plus 50 x 20 ms MS/MS scans per cycle.
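To make the arithmetic explicit, the quoted cycle works out as follows (a minimal sketch using only the figures given above – the 50 Hz headline refers to the per-scan MS/MS rate, not the full-cycle rate):

```python
# Back-of-the-envelope DDA cycle-time arithmetic for the figures quoted
# above: a 100 ms full scan plus 50 MS/MS scans at 20 ms each.
full_scan_ms = 100
msms_scans = 50
msms_time_ms = 20

cycle_ms = full_scan_ms + msms_scans * msms_time_ms
scan_rate_hz = 1000 / msms_time_ms  # each MS/MS scan takes 20 ms -> 50 Hz

print(f"cycle time: {cycle_ms} ms")       # 1100 ms per full DDA cycle
print(f"scan rate:  {scan_rate_hz:.0f} Hz")
```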


How good is this in the context of the other options?


Orbitrap (OT) analysers rely on the generation of an image current for ion detection. Without a critical number of ions (ion flux), no image current is generated and therefore no signal is detected. Detection effectively falls off a cliff when sufficient ion numbers are not present. In recent years, OT detectors with higher fields have facilitated faster scanning. However, the utility of these faster scanning routines is severely limited: running them efficiently requires shorter ion fill times, which often yield an inadequate image current and spectra devoid of information (as mentioned above). There is, therefore, a trade-off between acquisition speed and MS/MS signal. The headline 40 Hz scanning rates seem attractive until you interrogate the MS2 spectra and find you have acquired large numbers of spectra empty of analytical value.


What about the unrivalled resolution?

In addition to the increased number of “empty” spectra, shorter ion fill times result in a significant fall in resolution. At 40 Hz, resolution at 200 m/z (the most flattering figure to quote) is 7.5k, i.e. not dissimilar to our old QSTAR from 2004, and unlike on a QTOF, this resolution falls with increasing m/z. A TripleTOF can maintain its 40,000 resolution across the entire spectral mass range with no drop-off – and that’s in both MS and MS/MS modes.
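The fall-off can be illustrated with a toy model: FT-based analysers such as the Orbitrap lose resolving power roughly as 1/√(m/z), whereas a TOF holds its resolution across the range. This is an idealised textbook scaling for illustration, not a vendor specification:

```python
import math

def orbitrap_resolution(mz, r_at_200):
    # FT-based analysers: resolving power falls roughly as 1/sqrt(m/z)
    # (an idealised scaling used for illustration only).
    return r_at_200 * math.sqrt(200 / mz)

def qtof_resolution(mz, r_nominal=40000):
    # A TOF analyser holds roughly constant resolution across the range.
    return r_nominal

for mz in (200, 800, 2000):
    print(mz, round(orbitrap_resolution(mz, 7500)), qtof_resolution(mz))
```

Starting from the 7.5k-at-200-m/z figure quoted above, the model drops to ~3,750 at 800 m/z, while the TOF stays at its nominal 40,000 throughout.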


Let’s return to the original premise that all Core Facility Labs should have access to a fast QTOF. How versatile are they?


In short, they are remarkably versatile. Data-Dependent Acquisition (DDA) of complex mixtures at 50 Hz speaks for itself, but that’s only a small part of the picture. SWATH-enabled Data-Independent Acquisition (DIA) is rapidly becoming a challenger to DDA for the in-depth analysis of complex samples. The fast-scanning QTOF provides the perfect platform for such strategies, with a large number of SWATH windows being possible within the chromatographic timeframe.
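To give a feel for what “on the chromatographic timeframe” means, here is a rough SWATH cycle-time budget. All the numbers are assumptions chosen for the sketch, not instrument specifications:

```python
# Illustrative SWATH cycle-time budget (assumed figures, for the sketch).
n_windows = 64       # SWATH isolation windows per cycle
accum_ms = 25        # accumulation time per window (ms)
survey_ms = 50       # MS1 survey scan (ms)

cycle_s = (survey_ms + n_windows * accum_ms) / 1000
peak_width_s = 20    # assumed chromatographic peak width (s)
points_per_peak = peak_width_s / cycle_s

print(f"cycle = {cycle_s:.2f} s, ~{points_per_peak:.0f} points across a peak")
```

With these assumed figures the cycle comes in at 1.65 s, giving roughly a dozen points across a 20 s peak – enough to define the elution profile for quantification.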


Anything else?


Well, whilst SRMs on a triple quadrupole instrument provide a highly sensitive method of detection/quantification of lower abundance peptides, method set-up, especially for large numbers of transitions, is very time-intensive. Running a TripleTOF in MRMHR mode offers much easier method set-up, along with tremendous selectivity due to the 40,000 resolution of product ions across the mass range.


This begs the question: “Can a Core Facility Lab really do without a fast QTOF?”


Having the fast-scanning QTOF along with SWATH-enabled DIA fulfils two major requirements of a core facility lab – flexibility and robustness. Analysis of complex samples (microflow LC with DIA or DDA), gel bands such as pull-downs (nanoflow LC with DIA, DDA or MRMHR), or enriched phospho samples are all covered by the same analyser. This well-established platform has evolved over many generations to become THE essential mass spectrometer for proteomics analysis. Although the TripleTOF is a distant relative of the QSTAR, the major advancements in hardware, software and methods of operation mean that it is now capable of things only dreamed of a decade ago.

PRMs or SRMs - You decide.

Posted on 27th April, 2020

Informal chat comparing PRMs on TripleTOF to SRMs using a triple quad


On the first Tuesday of each month, two proteomics analysts, Pete and Dud, meet up over a few pints to discuss life, the universe and everything. This invariably includes a discussion about mass spectrometry, such as this recent chat last month…

Pete: Good to see you again Dud. 

Dud: You too mate. How’s work going?

Pete: Really good thanks. I’m using a very sensitive triple quad to do some peptide quant work on plasma samples at the moment. Tough matrix, but a great instrument. You just can’t beat a triple in SRM mode for absolute quant work.

Dud: Well, that’s one opinion. I’m sure most people would agree with you. But guess what….

Pete: Don’t tell me. You disagree??

Dud: Afraid so mate. Triples are great and have traditionally been the mass spec of choice for this type of work, but you need to get up to speed with what other options there are.

Pete: Such as?

Dud: Well, have you heard of Parallel Reaction Monitoring, or PRMs ? High-end Q-TOFs like Sciex’s 6600 are more than a match for the triples nowadays.

Pete: OK, I’m listening. But you’ll have your work cut out to convince me to switch from SRMs on a triple. I need to monitor three transitions per peptide, three peptides per protein and I’m looking at twenty proteins simultaneously (all times two as I have the corresponding heavy labelled standards too). That’s a total of 360 separate transitions in a really complex background. Ideally I want this done within a 30 min separation window. What do you think of that?
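(Pete’s total is easily checked – a trivial sketch of the arithmetic he states:)

```python
# Pete's transition count: 3 transitions x 3 peptides x 20 proteins,
# doubled for the corresponding heavy-labelled standards.
transitions_per_peptide = 3
peptides_per_protein = 3
proteins = 20
isotope_channels = 2  # light + heavy

total = (transitions_per_peptide * peptides_per_protein
         * proteins * isotope_channels)
print(total)  # 360
```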

Dud: Certainly sounds challenging. But what if you could simultaneously monitor ALL of the transitions for each of your peptides with high mass accuracy. In a complex background such as plasma, your selectivity rockets up.

Pete: OK, you’ve got my interest. Selectivity is a major requirement, but I’m looking at very low abundance proteins so sensitivity is key. You can’t touch a triple for sensitivity can you? What do you propose?

Dud: Whilst it is true that the absolute limit of detection is superior with a fully optimised SRM on a triple, you would be surprised how close a high resolution PRM gets to a good old-fashioned SRM, without all the method development overhead the SRM involves. In most ‘real world’ situations, if your SRM can detect it, the PRM can too, at 10% of the effort. Moreover, the selectivity advantage of multiple transitions (and remember you don’t need to decide which ones you are going to use because you have them all at no additional cost) boosts confidence and reduces false discoveries.

Pete: The selectivity advantage is certainly very appealing. But if I tune each of my transitions to optimal collision energies and voltage settings, then I’m confident that I can’t do any better than that with other instrument formats. With spiked-in heavy-labelled peptides to confirm the elution profile, my selectivity is not exactly shoddy either.

Dud: True, but you said you wanted to monitor 360 separate transitions. How long is that going to take you to set up? And what do you do if you decide to go after more peptides next time? Your set-up time becomes pretty unjustifiable – especially if you are then only going to analyse a limited number of samples.

Pete: OK. So how long does it take you to set up the PRM analysis?

Dud: That’s the beauty of method development for PRMs. I don’t have to pre-select the transitions before I acquire the data. They all come for free on a Q-TOF. Just need the precursor mass and a retention time window. One chromatographic run with my synthetic peptides and I’m away. After the data has been acquired I can pull-out my transitions using extracted ion chromatograms (XIC) – with whatever XIC window I like. The tighter the window, the higher the selectivity. Pretty important if plasma is your matrix!

Pete: Mmmm, maybe there is something of interest in this PRM business after all. I guess it’s the usual case of horses for courses in mass spectrometry.

Dud: Come down to the lab and have a bash. By the way, have you got any triple time I could bag? I have this one really tricky low abundance peptide I need to find… do you want another pint?
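Dud’s central point – that transitions are pulled out post-acquisition as extracted ion chromatograms with whatever window you like – can be sketched in a few lines. The data structures and m/z values below are invented for the illustration; real PRM data would come from a vendor file or an mzML reader:

```python
# Minimal post-acquisition XIC sketch (hypothetical data, for illustration).
def xic(spectra, target_mz, tol_ppm):
    """Sum intensity within +/- tol_ppm of target_mz in each spectrum.
    spectra: list of (retention_time, [(mz, intensity), ...])."""
    tol = target_mz * tol_ppm / 1e6
    trace = []
    for rt, peaks in spectra:
        signal = sum(i for mz, i in peaks if abs(mz - target_mz) <= tol)
        trace.append((rt, signal))
    return trace

spectra = [
    (10.0, [(500.2501, 900), (600.3000, 50)]),
    (10.1, [(500.2504, 1800)]),
    (10.2, [(500.2600, 40)]),   # falls outside a tight 10 ppm window
]
print(xic(spectra, 500.2502, tol_ppm=10))
```

Tightening `tol_ppm` is exactly the selectivity knob Dud describes: the near-isobaric interference at 500.2600 contributes nothing inside a 10 ppm window.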




Despite having consumed several pints of ale, Pete and Dud have presented a number of points relating to the merits of PRMs on a TripleTOF versus SRMs on a Triple Quad. Now why not have your say?


Who do you agree with the most, PETE or DUD?



Please add to the discussion by completing the COMMENTS box.

DDA vs DIA - A debate

Posted on 13th April, 2020

Food for thought........A debate on data acquisition approaches.


“This house proposes that with the introduction and rapid take-up of SWATH data acquisition, DDA will not survive for five more years.”



It was once said that a week is a long time in politics, but what constitutes a long time in mass spectrometry? Five years? Recent mass spectrometry instrument development has progressed at a startling rate, and with it, better and more ingenious ways of using the instruments have emerged. Faster and faster acquisition rates coupled with huge sensitivity improvements have enabled record numbers of peptides and proteins to be identified and quantified. Currently, the vast majority of approaches to proteomic analysis hinge on Data-Dependent Acquisition (DDA); however, I intend to demonstrate that this domination is rapidly coming to an end.

Data-Independent Acquisition (DIA), and in particular SWATH, is revolutionizing proteomics and is set to bury DDA for good. Gone will be the days of missed peptide fragmentation due to the stochastic nature of DDA sampling. Instead, since every ion entering the collision cell is subjected to fragmentation, post-acquisition processing has the potential to identify many more of the lower abundance peptides often “invisible” to DDA.

Current software approaches are addressing the difficulty of achieving this efficiently; however, within a relatively short time frame, I predict that this will be solved and, with it, the demise of DDA will be complete. Five years is a long time in proteomics – too long for DDA to survive.



As was rightly pointed out above, DDA is the basis for the vast majority of proteomics experiments globally today. Even though DIA has been around, in one form or another, for over a decade, it has still not been possible to convince most practitioners of its superiority and therefore of its ability to supersede DDA. The early QTOFs were only capable of acquiring one full scan and maybe three or four MS/MS scans per cycle. However, the modern super-fast TripleTOF (6600) can now handle acquisition speeds of up to 50 Hz – with high quality product ion spectra from meaningful (sub-microgram) amounts of digest on column. Unlike instruments which rely on an image current for detection, a modern, fast QTOF does not generate “empty” spectra at full throttle.

At present, DDA on TripleTOFs is capable of generating far more successful peptide IDs than an instrument operating in DIA mode. The limitations of DDA are not, as suggested, missed triggering of MS/MS due to spectral saturation; rather, they are a consequence of ionisation suppression in the source. This suppression phenomenon is equally detrimental to DIA.

I therefore contend that, far from being dead in five years’ time, DDA will remain as dominant as it is today, provided that the recent advances in QTOF speed and sensitivity continue apace.



There is no disputing the fact that DDA has facilitated enormous advances in proteomics over the past twenty years or so. However, it has an inherent weakness in that a precursor ion will either trigger MS/MS fragmentation or be missed completely. User-defined thresholds must be set globally in order to enable triggering. If set too low, fragmentation may occur at an inappropriate point in the chromatographic cycle. Optimal fragmentation should be achieved at or around the peak apex, which is not easy to achieve with any degree of consistency using DDA methods. A “snap-shot” of product ions is produced, with only one or two MS/MS spectra per precursor possible, whereas MS/MS data is obtained across the entire chromatographic peak with a DIA acquisition.
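The threshold-and-trigger behaviour described above can be illustrated with a minimal top-N selection sketch. The function name, thresholds and m/z values are all invented for the example; real acquisition software is, of course, far more elaborate:

```python
# Sketch of DDA top-N precursor selection (illustrative logic only).
def select_precursors(survey_peaks, threshold, top_n, excluded):
    """Pick up to top_n precursors above an intensity threshold,
    skipping m/z values on the dynamic-exclusion list."""
    candidates = [(mz, i) for mz, i in survey_peaks
                  if i >= threshold and mz not in excluded]
    candidates.sort(key=lambda p: p[1], reverse=True)
    return [mz for mz, _ in candidates[:top_n]]

survey = [(421.7, 5e4), (503.3, 2e3), (622.8, 9e4), (710.4, 1e5)]
print(select_precursors(survey, threshold=1e4, top_n=2, excluded={710.4}))
# 710.4 is excluded and 503.3 falls below threshold -> [622.8, 421.7]
```

The low-abundance precursor at 503.3 never triggers MS/MS at all – precisely the “either fragmented or missed completely” weakness the argument above identifies.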

In addition, triplicate injections of the same sample clearly demonstrate that lower abundance precursor ions are often not present in all three data sets acquired via DDA. I would concede that fast and sensitive acquisitions do, to some extent, mitigate this, and that acquisitions of sub-microgram digest loadings at 50 Hz cycling work very well on a TripleTOF. However, such acquisition speeds are unrealistic for other classes of instrument, which often require the chromatography column to be overloaded in order for the fastest acquisition rates to be utilized.

Possibly the greatest strength of DIA is having the ability to return to the captured data at a later stage for re-interrogation in the knowledge that all fragment ions will have been recorded multiple times. A missed peptide is missed forever in a DDA run, whereas DIA has the potential to reveal it, using ion extraction, post-acquisition. The ability of DIA to generate a comprehensive digital map for present and future interrogation is clearly a massive advantage.



Well-established proteomics software packages, such as Mascot and ProteinPilot, are used worldwide to great effect and form the basis of most DDA data analysis. The challenges associated with DIA data analysis are immense and, at present, no single pipeline is able to extract all of the information held within a DIA dataset. Research labs and Core Facilities do not have the luxury of waiting for such long-term developments to take place.

Current, off-the-shelf DDA methods allow very fast method set-up and sample-to-useful-data turnaround times. A deep-penetrance DIA experiment is only as good as its library. This can take an extraordinary amount of time and effort to generate and often involves off-line fractionation and multiple reacquisitions (in DDA, I might add) in order to obtain a “comprehensive” library. DDA, on the other hand, utilises well-established, proven algorithms with no preconceptions as to which proteins may or may not be present. And that is to say nothing of modifications.

Five years is quite simply not long enough for the DIA school to get its act together, and so I can say with absolute confidence that DDA is here for the long game, and will still be the dominant mode of proteomics data acquisition throughout the 2020s.


Concluding remarks - FOR

The dinosaurs amongst us will cling on to the familiar rather than embrace the latest advances. But just as the dinosaurs died out some sixty-five million years ago, in five years’ time it will be the turn of data-dependent acquisitions. DIA data is effectively future-proofed in that it can be re-interrogated at a later stage for “future” peptides of interest. With DDA, you get what you get and the rest is lost forever - R.I.P. DDA, it’s been good knowing you, but your time is now over!


Concluding remarks - AGAINST

The death of DDA in a mere five years is pie in the sky. Increased QTOF acquisition speeds, coupled with the well-established bioinformatics infrastructure and the multiplexing capabilities of isobaric tagging reagents, ensure that there will be a dominant space for DDA analyses for the foreseeable future at least. DIA simply isn’t mature enough to supersede DDA, and won’t be for many years to come. Nice try, but DDA methods work, and they work well. So if it ain’t broke, don’t fix it. DIA has its place, but that has to be in conjunction with DDA, not instead of it.





You have now heard a number of arguments in favour of, and opposed to, the premise that “With the introduction and rapid take-up of SWATH data acquisition, DDA will not survive for five more years.”


Now have your say.         


Please VOTE either FOR or AGAINST and if you would like to contribute to this month’s debate, please add a COMMENT on this Blog.