Improving Integration in 1D 1H NMR and 13C NMR

from http://chemnmr.colorado.edu/ammrl/archives/May-2002/21.html

Thank you so much for your suggestions and comments on quantitative NMR. Here is a summary of responses with names and personal notes removed.

ORIGINAL REQUEST:

Dear AMMRL,

Do any of you currently use software (or macros your lab has created) to improve integration of your 1-D spectra? What is the name of the software and who made it? How did you evaluate its performance? What is your application for quantitative NMR?

I have been doing a lot of quantitative analysis by proton NMR (standards authentication, etc.) and have written some macros to make it easier for me to quantitate the commonly seen compounds. However, I am about to start on some work that will require I do a lot better. I have wondered what the limitation of NMR is for accuracy and precision.

Also, can any of you recommend a professor or chemist who is working on improvement of NMR integration?

Any help you can give me is much appreciated.

Patrick Hays
*************

Response #1:

You might try LC-Model:
http://s-provencher.com/
The manual gives some insight into the issues concerning the types of measurements that you are trying to do. It is not meant for high-resolution work, but it can be coaxed into doing it.
Otherwise, MRUI should do the job:
http://www.mrui.uab.es
Things to consider when thinking about "what the limitation of NMR is for accuracy and precision":
- How good are your concentrations, pH, tuning/matching, 90-degree pulse calibration, B0 compensation, and temperature control?
- Is the receiver gain always the same? If not, have you calibrated it over the range that you are using?
- What about DC offset?
- What is your signal-to-noise ratio, and how many averages do you need?
- What is the field drift, and how good is your lock?
- In the end you are looking at the area under the curve, but how good is your shim?
Hint:
Use an internal standard if possible.
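For concreteness, here is a minimal Python sketch of that internal-standard arithmetic (the function name and the numbers are illustrative assumptions, not from this response): the analyte concentration follows from the per-proton integral ratio and the known concentration of the standard.

    def conc_from_internal_standard(i_analyte, i_std, n_analyte, n_std, conc_std):
        # i_*      : integrated areas of the analyte and standard signals
        # n_*      : number of protons contributing to each signal
        # conc_std : known molar concentration of the internal standard
        return (i_analyte / n_analyte) / (i_std / n_std) * conc_std

    # Example: a 3H analyte singlet vs. the 2H singlet of a 10.0 mM standard
    print(conc_from_internal_standard(1.52, 1.00, 3, 2, 10.0))  # -> ~10.13 mM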

********
Response #2:

I would suggest you try deconvolution programs if you use Varian or Bruker instruments.
I am not sure about other instruments.
The deconvolution program is included in VNMR or UXNMR.
The program will fit the individual lineshapes (Lorentzian, Gaussian, or a mixture of both) of the peaks in the region you are interested in.
Hope this info will help you.
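As an illustration of what such a deconvolution does, here is a minimal scipy sketch (an assumption-laden stand-in, not the VNMR or UXNMR routine): fit a sum of mixed Lorentzian/Gaussian (pseudo-Voigt) lines to a region and take the fitted areas.

    import numpy as np
    from scipy.optimize import curve_fit

    def pseudo_voigt(x, area, x0, fwhm, eta):
        # Mixed Lorentzian/Gaussian line of unit area, scaled by `area`;
        # eta is the Lorentzian fraction (0 = pure Gaussian, 1 = pure Lorentzian).
        sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        gauss = np.exp(-0.5 * ((x - x0) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
        lorentz = (fwhm / (2.0 * np.pi)) / ((x - x0) ** 2 + (fwhm / 2.0) ** 2)
        return area * (eta * lorentz + (1.0 - eta) * gauss)

    def two_peaks(x, a1, x1, w1, e1, a2, x2, w2, e2):
        # Model for a region containing two overlapping lines.
        return pseudo_voigt(x, a1, x1, w1, e1) + pseudo_voigt(x, a2, x2, w2, e2)

    # Synthetic overlapping pair (areas 1.0 and 0.5) with a little noise:
    x = np.linspace(3.30, 3.50, 2000)
    y = two_peaks(x, 1.0, 3.39, 0.01, 0.7, 0.5, 3.41, 0.01, 0.7)
    y += np.random.default_rng(0).normal(0.0, 0.5, x.size)

    popt, _ = curve_fit(two_peaks, x, y,
                        p0=[1, 3.39, 0.01, 0.5, 1, 3.41, 0.01, 0.5])
    print(popt[0], popt[4])  # fitted areas, ~1.0 and ~0.5

Phase and baseline-correct the region first, as the next response notes; the fit cannot distinguish a curved baseline from broad line tails.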

**********
Response #3:

Won't go into this in detail; it's an area where a _lot_ of work has been done over the last 50 years.

1. Baseline correction is likely the single most important factor in removing common errors in integrals (although see below); a sketch of one common polynomial-fit approach follows this list. NUTS, with its FB command, does a very good job of baseline correction. VNMR, Xwinnmr, Winnmr, and other reasonably sophisticated packages also do a good job of baseline correction, but are less straightforward than NUTS to apply. Various sophisticated baseline correction routines have been written for Felix that go well beyond what is available in packaged software, but these relate (I believe) primarily to multidimensional data.

2. Deconvolution fitting is essential to obtaining accurate peak areas when peaks are not baseline resolved. Again, NUTS and other major software provide this function. It is important to perform proper phasing and baseline corrections prior to the deconvolution fitting.

3. Even with the best deconvolution, sub-1% quantitation is greatly helped by 13C decoupling (which collapses the 13C satellites). Other J-coupled or isotope-shifted peaks may have to be accounted for to get the best fits/areas.

4. Properly accounting for relaxation is essential for good quantitative work.
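As promised in point 1, here is a minimal numpy sketch of one common automatic baseline correction (a polynomial fit through signal-free points; the threshold, polynomial order, and noise estimate are illustrative assumptions, not the NUTS FB algorithm):

    import numpy as np

    def polynomial_baseline_correct(y, order=3, noise_mult=3.0):
        # Treat points within a few noise widths of the median as signal-free,
        # fit a low-order polynomial through them, and subtract it.
        x = np.arange(y.size)
        noise = np.median(np.abs(y - np.median(y)))       # robust (MAD) estimate
        is_baseline = np.abs(y - np.median(y)) < noise_mult * noise
        coeffs = np.polyfit(x[is_baseline], y[is_baseline], order)
        return y - np.polyval(coeffs, x)

Run the correction before integrating or deconvolving; a residual baseline tilt adds an error to every integral that grows with the width of the integrated region.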

**********
Response #4:

I think that the accuracy of most integration is limited by not knowing the T1. A colleague of mine could integrate oil samples to within 1%, as determined by reference to a destructive quantitation technique. Most people seem to get within 10%.
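The arithmetic behind that T1 limit, as a small sketch (the numbers are illustrative): with 90-degree pulses, the fraction of signal lost to incomplete relaxation at repetition time TR is exp(-TR/T1), so sub-1% work needs TR of roughly 4.6 x T1.

    import math

    def recovery_error(tr, t1):
        # Fractional signal lost to incomplete T1 recovery (90-degree pulses).
        return math.exp(-tr / t1)

    def tr_for_error(t1, max_error=0.01):
        # Repetition time needed to keep that loss below max_error.
        return t1 * math.log(1.0 / max_error)

    print(tr_for_error(4.0))          # T1 = 4 s -> TR ~ 18.4 s for <1% error
    print(recovery_error(10.0, 4.0))  # TR = 10 s, T1 = 4 s -> ~8% error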

**********
Response #5:

I recommend the book "NMR Data Processing" by Hoch and Stern (1996, Wiley-Liss). There is a chapter on quantification, and it deals with the limitations that you are worried about.

There are some (curve-fitting) algorithms that may improve your numbers, but the usual integration, in combination with a sufficient number of points per peak (at least 5; more is better), a flat baseline, and excellent signal-to-noise, will do a pretty good job (you probably know that already).
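The points-per-peak arithmetic is simple enough to sketch (the numbers are illustrative): digital resolution is the spectral width divided by the number of real points, so zero-filling is the cheap way to reach the five-plus points per peak mentioned above.

    def points_per_peak(sw_hz, n_points, linewidth_hz):
        # Digital resolution (Hz/point) = spectral width / number of real points.
        return linewidth_hz / (sw_hz / n_points)

    # A 0.7 Hz wide peak, 8 kHz sweep width, 32k-point spectrum: ~2.9 points/peak
    print(points_per_peak(8000, 32768, 0.7))
    # Zero-filling the same data to 128k points gives ~11.5 points/peak
    print(points_per_peak(8000, 131072, 0.7))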

**********
Response #6:

Not a controlled answer, but I think that the company EUROFINS has pushed the analysis as far as quantifying directly from the FID, to avoid phasing problems. Martin (one of the sons of the company's founder) published the mathematics; it should appear under chemometrics.

**********
Response #7:

We run quantitative analysis for our standards characterization lab. We use ACD software to process, integrate, and create the report. I don't believe that the ACD software is any more accurate than the integration routines on the Varian systems, but it is easier to create a report with it. The integration routine on the Varian spectrometer is extremely simple: the system just sums the intensities of all data points under the selected region. This is what you want for a quantitative integral; the only problem is that it is highly baseline dependent. There are some other, more sophisticated methods for dealing with integration, such as Bayes or deconvolution. Unfortunately, neither of these methods handles complex (i.e., coupled) lineshapes very readily. As for accuracy of integration, we can achieve a routine accuracy of +/- 1%.
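A sketch of that summation integral and its baseline dependence (illustrative only, not the Varian routine): the integral is just the sum of the point intensities over the region, so a constant baseline offset biases it by the offset times the region width.

    import numpy as np

    def region_integral(spectrum, start, stop):
        # Sum the point intensities in the selected region, as a simple
        # spectrometer integration routine does.
        return float(np.sum(spectrum[start:stop]))

    y = np.zeros(1000)
    y[490:510] = 1.0                             # a 20-point "peak" of area 20
    print(region_integral(y, 450, 550))          # -> 20.0
    print(region_integral(y + 0.05, 450, 550))   # -> 25.0 (0.05 x 100 points added)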

************
Response #8:

To ensure the best integrals, make sure that you:
1. Adjust alfa so that lp=0 (using the calfa macro), to eliminate the first source of baseline curvature; then
2. Perform a two-point back linear prediction (after alfa is properly set), to eliminate the second source of baseline curvature.

LATER FROM SAME RESPONDER:
If your baseline is FLAT (because alfa is set correctly and the 2nd point is back-LP'ed) you will have no integral drift (assuming the phasing is correct as well). A "dc" is fine and necessary, but if you feel the need to do a "bc" (because of integral drift), you are probably not set up as well as you could be. Make sure the LP number of points (lpnupts) is at least twice the number of coefficients (lpfilt). To be complete:
parlp - creates LP parameters
dglp - displays them
proc='lp' (vs proc='ft') - turns on LP processing
lpalg='lpfft' - the default
lpopt='b' - for back prediction (vs 'f')
lpfilt=32 - the default
lpnupts=128 - make sure it is > twice the size of lpfilt
strtlp=3
lpext=2
strtext=2
(The last six values set up the two-point back linear prediction.)

To make everything come out well, the phasing, the alfa, and the linear prediction (all of which interact a bit) need to be set up properly. How you do this depends a bit on whether you are using any DSP or not. If you are using DSP, you also need to adjust the rof2/alfa ratio (keeping the sum constant) to achieve the flattest baselines.
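For illustration, here is a minimal numpy sketch of what two-point backward linear prediction does to the start of a FID (an assumption-laden re-implementation, not the VNMR lpfft code; the defaults only loosely mirror lpfilt=32 and lpnupts=128 above):

    import numpy as np

    def backward_lp(fid, n_bad=2, order=32, n_fit=128):
        # Re-predict the first n_bad (corrupted) complex points from the next
        # n_fit good points, using `order` backward-LP coefficients:
        #   x[n] ~ sum_k a[k] * x[n + 1 + k]
        good = fid[n_bad:n_bad + n_fit]
        rows = n_fit - order
        A = np.array([good[i + 1:i + 1 + order] for i in range(rows)])
        b = good[:rows]
        coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
        out = fid.astype(complex)            # work on a copy
        for n in range(n_bad - 1, -1, -1):   # predict backwards, nearest first
            out[n] = np.dot(coeffs, out[n + 1:n + 1 + order])
        return out

    # Demo: a decaying complex exponential whose first two points were spoiled
    t = np.arange(512)
    fid = np.exp((2j * np.pi * 0.05 - 0.01) * t)
    spoiled = fid.copy()
    spoiled[:2] = 0.0
    print(np.abs(backward_lp(spoiled)[:2] - fid[:2]))  # residuals near zero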

***********
Response #9:
Sounds like you are asking about software solutions.

An interesting way to quantitate without adding an internal standard to each sample is to perform signal injection using ERETIC, as described at the ENC. In general, the accuracy and precision of the signal-injection method seemed to be higher than adding an internal standard.

The trouble is, you need a waveform generator on the channel from which you pulse 15N, and the probe needs to be one that picks up cross-talk between the coils, like a flow probe. Traditional 5 mm or 3 mm probes will likely not pick it up. That also, unfortunately, means you give up S/N, so ERETIC presents a trade-off.

**********
Response #10:
The metabonomics people claimed to have improved everything about getting the numbers off the spectrum: phasing, etc. This would be Jeremy Nicholson and John Lindon, a.k.a. Metabometrix. And there is another metabonomics company, Chenomx (both have web sites). Software people have probably also thought long and hard about this; Acorn NMR, as a mom-and-pop operation, might be the most accessible.

There was a paper which reported achieving 1% quantitative accuracy by NMR. They had to use two technicians and two scales, and rotate between the various combinations (Maniara et al., Anal. Chem. 70:4921-4928 (1998)).

The clearest discussion of T1's and tip angles that I have found is Rabenstein and Keire's chapter (pp. 323-369) in "Modern NMR Techniques and their Applications in Chemistry", A. Popov and K. Hallenga, eds., Practical Spectroscopy Series, Marcel Dekker (1990).

***********
Response #11:

In our PERCH project (http://www.uku.fi/perch.html) we have spent a lot of time on qNMR (quantitative NMR). In our opinion, the deconvolution method is the best tool for integration of NMR signals. For example, in recent experiments we have found that impurity signals in drugs corresponding to 0.1 mol% can easily be quantified if the signals do not overlap strongly. This means that for small solvent molecules the quantification limit is below 0.01 wt%. We are actively developing qNMR for a number of applications.

Our ten years of experience in qHNMR have been incorporated into our PERCH NMR Software (see the above web address). The TLS tool of the software contains many features (constraints, powerful baseline functions, and options) that, as far as we know, are not found in any other software. For an example of the constraints and multiplet structures, see Magn. Reson. Medicine 36:359-365 (1996).

Unfortunately there is no demo of the software available at this moment. A new version, with a better Windows interface, should be available in June. If you have any questions or would like to test the present version, please do not hesitate to contact me again.