The Chemist | Journal of the American Institute of Chemists | Volume 95, Number 1
 

Harmonization of Blood Electrolyte Concentration Results: Are Values 'Watered Down'?

R. F. Moran
Gilford, NH 03249
 

Abstract: Analysis of physiologic specimens such as blood, urine, and cerebrospinal fluid for electrolytes, especially sodium and potassium, began more than a century ago as physician-biochemists sought objective ways of assessing the most basic aspects of physiology: water-electrolyte and acid/base balance. Of the types of specimens, whole blood and the liquid serum or plasma[I] separated from the formed elements[II] became the medium in which these ions were measured, first with gravimetry, later by emission/absorption spectroscopy. Each reported results as a concentration per volume of plasma (mg/dL or mEq/L). Expected or normal ranges were established, which in the case of sodium were quite narrow relative to the average (140 +/- 5 mmol/L). The advent of ion selective electrode (ISE) technology made it possible to measure these ions directly in whole blood with minimal specimen preparation, but ISEs gave results consistently higher than the established norms by an amount equal to the 'normal' range. Since most initial evaluative blood work on a patient admitted to a hospital or clinic includes the electrolytes, physicians had become accustomed to the established values and regarded the 'high' ISE values as 'errors'. In fact, it was the ISE that was measuring what the body's sensors detect, and it specifically allowed differentiation in certain serious but less common disorders, a fact that was obscured by the inherent difference in values on normal subjects. Technically the matter is simple (though there are some complex nuances): the electrolytes are dissolved in the water fraction of the plasma, not the plasma as a whole. Changing well established physician perspectives, however, was (and is) a different story. And now the rest of the story….

Key Words: Ion selective electrode (ISE)



[I] Serum: the liquid portion of blood after natural coagulation is complete; plasma: the liquid portion resulting when coagulation has been inhibited, typically by additives in the collection vessel. We will use the term plasma to represent both in this report.

[II] Formed elements: blood cells (erythrocytes/RBCs, leucocytes/WBCs, and platelets), each metabolically active.


Introduction

Measurement of sodium and potassium in both urine and blood has been an established practice in support of clinical decision making for more than a century. Early methodology for each measurand[III] involved quantitative dilution of an aliquot of plasma[IV] from a specimen of blood to precipitate proteins, followed by use of the clear diluted filtrate to quantitatively precipitate (with further dilution) the selected ions, and gravimetric measurement of the precipitate: basic quantitative analysis.
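To make that back-calculation concrete, here is a minimal Python sketch of the gravimetric arithmetic described above; the precipitate formula mass, stoichiometry, aliquot size, and dilution factor are hypothetical placeholders, not values from the article.

```python
# Hypothetical back-calculation from a gravimetric assay: weigh the precipitate,
# convert to moles of the target ion, then account for aliquot size and dilution.
# All numbers below are illustrative placeholders.

precipitate_mass_g = 0.0500      # mass of the washed, dried precipitate
formula_mass_g_per_mol = 350.0   # formula mass of the (hypothetical) precipitate
ions_per_formula_unit = 1        # assumed stoichiometry: moles of Na+ per mole of precipitate
aliquot_mL = 1.0                 # plasma aliquot taken into the assay
dilution_factor = 1.0            # overall dilution applied before precipitation

moles_ion = precipitate_mass_g / formula_mass_g_per_mol * ions_per_formula_unit
conc_mol_per_L = moles_ion * dilution_factor / (aliquot_mL / 1000.0)
print(f"{conc_mol_per_L * 1000:.0f} mmol/L")   # -> 143 mmol/L for these placeholders
```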

The process of collection, preparation, and measurement could take many hours. The measurements were physiologically informative and enabled the establishment of an expected or 'normal' range for healthy subjects (sodium: 135-145 mEq/L; potassium: 3.5-4.8 mEq/L of plasma). Once it was realized that the measurements could be used to detect various imbalances of metabolism and aid in the management of renal disease, diabetes, and other clinical conditions, they became a desirable laboratory procedure[V] [1].

The physiologically narrow 'normal' range and the observed significance in the acutely ill led clinical biochemists to seek out methods of measurement that allowed a smaller specimen size and shorter processing time while giving equally reliable results. Consequently, among other analytical approaches, the qualitative 'flame' test for dissolved ions was quantitatively adapted to the electronics becoming available in the early to mid 20th century. This methodology dramatically increased the clinical utility of sodium and potassium measurements since it was sensitive, reliable, rapid, and easy to perform. Now the tests could be performed in a matter of minutes on blood specimens of <5 mL instead of 10-20 mL.



[III] Measurand: the entity intended to be measured (e.g., one measures the millivoltage produced by a sensor detecting the pH of a buffer, a pH that is due to the carbon dioxide tension; thus the measurand is carbon dioxide, and the kind of quantity or characteristic being measured is, in this example, tension or partial pressure).

[IV] Earlier still was a similar process in which 'whole blood', that is, the mixture of blood cells, proteins, and the liquid fraction, was precipitated, followed by treatment to isolate the ions.

[V] Today, it is estimated that hyponatremia affects about 30% of the hospitalized population in the United States.


 

Flame Atomic Emission Spectroscopy in Clinical Laboratories: Method Characteristics

Two major characteristics of the flame atomic emission spectroscopy (FAES)[VI] analytical process as applied to blood are salient to this report. First, as with gravimetry, FAES of blood plasma requires separation of the plasma from the blood cells, and each subsequent step must be quantitative: accurate measurement of a plasma aliquot and a quantitative dilution, both to bring the light intensity of the excited electrolyte ions into a suitable range for the photodetectors and to reduce the amount of plasma protein and lipid processed through the flame. Second, to get quantitative results, it is necessary to compare the flame emission intensity of the diluted plasma specimen with the emission intensity of a standard of known concentration. Standards of known concentration in aqueous solution were treated/diluted the same as the plasma or urine patient specimen. This gave excellent comparison to the 'gold standard', gravimetry.
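As a small illustration of the comparison step just described: because the aqueous standard and the plasma aliquot are carried through identical dilutions, the dilution factors cancel and the concentration follows from the intensity ratio. The intensities and standard value in this sketch are hypothetical.

```python
# Quantitation by comparison with a standard treated identically to the specimen.
# Because both are diluted the same way, the dilution factors cancel in the ratio.
# Numbers are illustrative only.

standard_conc_mmol_L = 140.0      # aqueous Na+ standard, before dilution
standard_intensity = 1000.0       # emission intensity of the diluted standard
specimen_intensity = 965.0        # emission intensity of the diluted plasma

specimen_conc = standard_conc_mmol_L * (specimen_intensity / standard_intensity)
print(f"{specimen_conc:.1f} mmol/L")   # -> 135.1 mmol/L
```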

For several decades this analytical approach was the norm, with countless clinical and analytical reports, medical texts, and external quality assurance schemes [2] based on measurements using the technology. During this period, atomic absorption spectrometry (AA) became more commonly available but offered no practical advantage over FAES for processing clinical specimens.

 

Method of Measurement Establishes Reference Ranges:

Both the gravimetric and FAES/AA methods use specimens of plasma that are diluted, then incorporate standards prepared in an aqueous medium having Na/K values bracketing the range seen in gravimetry – makes sense, right? These standards were then diluted in the same sequence applied to the plasma specimen. Both approaches resulted in 'normal' patient values of 135-145 mEq/L (mmol/L). So everything lined up analytically with the expected, despite the interesting fact that during the same period so-called isotonic saline for infusion was recognized as 0.9% NaCl, which corresponds to about 154 mmol/L each of sodium and chloride.
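As a quick check of that figure, a worked example assuming 0.9% w/v NaCl and a molar mass of 58.44 g/mol for NaCl:

```python
# 0.9% (w/v) NaCl = 9 g of NaCl per litre of solution.
grams_per_L = 9.0
molar_mass_NaCl = 58.44            # g/mol
mmol_per_L = grams_per_L / molar_mass_NaCl * 1000
print(f"{mmol_per_L:.0f} mmol/L each of Na+ and Cl-")   # -> 154 mmol/L
```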

Ion Selective Electrodes – New Technology Arrives! Ion selective electrodes (ISEs) measure either voltage or current. Their selectivity comes from at least two components of their design: 1) a membrane to separate the electronic components from the blood cells and plasma protein, and 2) a characteristic selectivity for the measurand to be detected. For some measurands these are combined; for others the selectivity may come from additional reactants[VII] in the internal characteristics of each 'electrode'[VIII].
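For the potentiometric ISEs discussed here, the measured voltage is related to ion activity by the familiar Nernst relationship. The sketch below assumes ideal Nernstian behavior for a monovalent cation at 37 °C; the standard potential and the simulated reading are illustrative, not taken from the article.

```python
import math

R = 8.314        # J/(mol*K), gas constant
F = 96485.0      # C/mol, Faraday constant
T = 310.15       # K, body temperature (37 C)
z = 1            # charge of a monovalent cation such as Na+ or K+

def nernst_slope_mV(temp=T, charge=z):
    """Theoretical Nernstian slope in mV per decade of activity."""
    return 1000.0 * math.log(10) * R * temp / (charge * F)   # ~61.5 mV/decade at 37 C

def activity_from_potential(E_mV, E0_mV, slope_mV=None):
    """Convert a measured electrode potential (mV) to ion activity (mol/L),
    assuming ideal behavior: E = E0 + S * log10(a)."""
    S = slope_mV if slope_mV is not None else nernst_slope_mV()
    return 10 ** ((E_mV - E0_mV) / S)

# Illustrative numbers only: E0 and the simulated reading are hypothetical.
E0 = 0.0
E_meas = E0 + nernst_slope_mV() * math.log10(0.140)   # reading for a Na+ activity of 0.140 mol/L
print(round(activity_from_potential(E_meas, E0), 3))   # -> 0.14
```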

'Indirect' ISE Measurement: The ISE can be used to measure in an environment like that in which the FAES/AA measurement occurs. In that situation the separated plasma specimens are diluted, treated, and processed continuously. This is operationally convenient and economical for exceptionally large volume operations in which the electrolytes are just a part of an overall set of measurements for health or metabolic disorder. Without statistical quibbling, the results obtained should be the same as for flame emission or other means of measuring sodium/potassium.

Direct Measurement Using ISEs: The advent of ISEs also allowed direct measurement of sodium and potassium (and later other electrolytes/measurands). However, it was immediately recognized that ISEs immersed in plasma gave different values than the 'standard' (diluted) methods. Typically, 'direct' ISE results were about 6-8 percent higher, i.e., ~150 mEq/L for sodium. The first thought was that this was a method error between the traditionally measured sodium and the ISE; at least the values were consistent.


[VI] Clinical lab colloquial: flame photometry.

[VII] Arguably the first ISE was the carbon dioxide electrode of Severinghaus and Bradley, which used a Silastic membrane to separate the blood from the electronics but allowed passage of CO2 gas. The gas in turn changed the internal pH, which was detected by the internal pH-sensitive glass.

[VIII] For some systems these 'electrodes' look nothing like what one is accustomed to: they are integrated into a circuit board and include multiple sensors and flow paths for the specimen, handling 10 or more analytes in the space of a pinky fingernail!


Note the percent bias of the sodium (the one easiest to spot because of its magnitude). For those unfamiliar with blood plasma values, that bias corresponds identically to the mass percent of plasma protein[IX].
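A minimal numerical sketch of where a bias of that size comes from, assuming a typical plasma water fraction of about 93% (roughly 7% protein plus lipid by mass); the figures are illustrative.

```python
# Indirect methods report concentration per litre of total plasma; a direct ISE
# effectively reports concentration per litre of plasma water. With ~93% of the
# plasma being water, the two differ by roughly 7%. Illustrative numbers only.

sodium_total_plasma = 140.0      # mmol per litre of plasma (indirect/flame-style result)
plasma_water_fraction = 0.93     # assumed water fraction of normal plasma

sodium_plasma_water = sodium_total_plasma / plasma_water_fraction
bias_percent = 100 * (sodium_plasma_water - sodium_total_plasma) / sodium_total_plasma
print(f"{sodium_plasma_water:.1f} mmol/L, a bias of {bias_percent:.1f}%")  # -> 150.5 mmol/L, 7.5%
```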

Blood Gases by ISE: A measurement set clinically related to the electrolytes is the group of tests called 'blood gases'. Just as electrolytes are among the most critical tests in the initial evaluation of critically ill patients, so are the blood gases (some would say they can be even more critical since they relate to the patient's oxygen status). Blood gas analysis (BGA) consists of pH (acid-base) as well as the measurement of oxygen and carbon dioxide tension (pO2 and pCO2), both essential in assessing breathing (the most basic of body functionality); these too are measured using electrodes that are themselves ion or gas selective, and they necessarily measure directly in whole blood.

Using ISEs for direct measurement of electrolytes on undiluted whole blood made it clearly feasible to incorporate the two sets of critical tests together on one device – the 'enhanced' blood gas analyzer (eBGA). Over the past quarter century this has been the most common clinical use of direct-measurement ISE technology, with the three largest manufacturers of BGAs having eBGAs as the 'flagship' models of their portfolios: the blood gas component because it is analytically required, the electrolyte component because it is fast, dependable, and convenient.

In addition to being able to measure whole blood or plasma directly for electrolytes, the ISE has a further advantage. Since it measures an undiluted specimen, it senses what the body's homeostatic system senses – the activity of the ion in the plasma water.

What had previously been measured by both the 'gold standard' gravimetry and FAES/AA was a different quantity: the ion's concentration in the liquid phase – a liquid containing a colloidal suspension of various proteins plus, potentially, lipoproteins and free lipids – none of which contain significant amounts of either sodium or potassium.

Fortunately, the total protein concentration of blood serum/plasma is consistent, so for many clinical conditions this is not an issue. However, for patients with low or high plasma protein levels, the induced protein/plasma-water error, if unrecognized, could push the apparent sodium/potassium values outside the normal range to the point of appearing to require electrolyte therapy.

An analogous situation arises with hyperlipidemia. The lipid fraction of the liquid plasma/serum contains no electrolytes. If that fraction is significant – as in diabetes, pancreatic diseases, other lipid disorders, or even hyperalimentation – an analytically accurate measurement could easily be misleading.
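To illustrate, a hedged numerical sketch: if severe hyperlipidemia were to reduce the plasma water fraction from about 93% to, say, 85%, an indirect (diluted) method would report a frankly 'low' sodium even though the concentration in the plasma water, which the body actually regulates, is unchanged. The fractions below are assumptions for illustration only.

```python
# Pseudohyponatremia sketch: the plasma-water sodium is normal, but the indirect
# result falls with the water fraction. All values are illustrative assumptions.

sodium_in_plasma_water = 150.5    # mmol per litre of plasma water (what a direct ISE tracks)
normal_water_fraction = 0.93
lipemic_water_fraction = 0.85     # assumed, e.g. severe hyperlipidemia

indirect_normal = sodium_in_plasma_water * normal_water_fraction
indirect_lipemic = sodium_in_plasma_water * lipemic_water_fraction
print(f"indirect, normal plasma:  {indirect_normal:.0f} mmol/L")   # -> 140 mmol/L
print(f"indirect, lipemic plasma: {indirect_lipemic:.0f} mmol/L")  # -> 128 mmol/L (apparent hyponatremia)
```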


[IX] There are several other minor contributors (junction potential gradient, etc.) to the bias (N. Fogh-Andersen, personal communication, 2024).

 

Standardizing and Reporting Blood Electrolytes

Given the dilemma of physician expectations developed over decades versus the real advantage of direct ISE measurement, there seemed to be two choices: 1) keep the (physician-)familiar concentration of the ions in the total plasma volume, or 2) make a change and report the ion activity in the aqueous phase [converted by convention into concentration units (mmol/L) and familiar to electrochemists and system developers].

To change or not to change – the 'watering down'? It is not my point to re-argue which approach is best, but rather to cite what two internationally respected organizations [3,4] have recommended: for purposes of harmonizing results for sodium and potassium, direct, undiluted ISE systems should be correlated to agree with the FAES/AA reference methods [5,6] or, alternatively, with a reference material developed in accordance with the procedures outlined in the NCCLS/CLSI standard C29-A. The processes that led to these conclusions and recommendations involved substantial input from, and agreement by, manufacturers and thought-leader users of ISE systems as well as governmental representatives.

Beyond these 'paper' standards and recommendations, the National Institute of Standards and Technology (NIST) in the USA and the Health Care Technology Foundation (HECTEF) in Japan have actually developed Certified Reference Materials (CRMs) to allow more convenient harmonization of the various ISE systems, so that they can all report the same value across a range of Na/K analyzers. This enormously successful effort has made harmonization of results for these analytes feasible and, in fact, practical.

While there have been issues with widespread availability of the CRMs to clinical laboratories, the process for preparing reference materials, described quite clearly in NCCLS C29-A2, has been available to manufacturers for some time – only minor changes were necessary in the description of that process as the standard went through the proposed, tentative, and now the final approved levels of publication. Thus, we should all be able to have quantitatively comparable results for Na/K using an ISE system.
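As a generic illustration of the kind of correlation exercise implied here (not the C29-A procedure itself), one could regress direct-ISE results against reference-method values on split specimens and use the fitted slope and intercept to place direct readings on the reference scale; every number below is hypothetical.

```python
# Hypothetical split-specimen comparison (mmol/L, illustrative only): reference
# results from a FAES/AA-style method, direct results from an undiluted ISE system.
# Ordinary least squares gives the slope/intercept mapping direct -> reference.

reference  = [132.0, 136.0, 140.0, 144.0, 148.0]   # reference-method sodium
direct_ise = [141.9, 146.2, 150.5, 154.8, 159.1]   # direct (undiluted) ISE sodium

n = len(reference)
mean_x = sum(direct_ise) / n
mean_y = sum(reference) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(direct_ise, reference)) \
        / sum((x - mean_x) ** 2 for x in direct_ise)
intercept = mean_y - slope * mean_x

def harmonized(direct_reading):
    """Map a direct-ISE reading onto the reference-method scale."""
    return slope * direct_reading + intercept

print(f"{harmonized(150.5):.1f} mmol/L")   # -> 140.0 on the reference scale
```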

 

Whole Blood vs. Plasma vs. Plasma Water:

In whole blood specimens measured with direct ISEs, all the electrolytes are determined similarly: they exist in the water fraction of the plasma, and their concentrations in that plasma water are measured in the 'whole blood' specimen. While water fractions vary (e.g., hyperlipidemia, hyperproteinemia), electrolyte balance is governed by the water fraction in which these electrolytes exist. The sensors involved in this 'whole blood' measurement sense what the body's own sensors do – the concentration in the plasma water. ISE-based systems using undiluted specimens of either whole blood or plasma therefore report a value for sodium that is 'normalized' to the average plasma water concentration. The clinical implication is that in cases of lowered plasma water fraction (hyperlipidemia, hyperproteinemia), sodium (and potassium) levels are indicative of real electrolyte status rather than being artifactually low.

The conundrum comes into play when interpreting clinical results. As stated earlier, most clinicians are more accustomed to results obtained from the mass-specimen-processing instrumentation (indirect ISE), and when confronted with results from both an acute care facility (probably a direct measurement) and a routine central laboratory, they are more likely to trust their usual source of electrolyte results first! As we have shown, that is a potentially bad move, with potentially profound consequences both diagnostically and therapeutically.

 

Conclusion

Where, then, do we stand? The promise of clinical utility without confusion, especially in the critical care and point-of-care areas, brought about by eBGAs measuring electrolytes on whole blood, has been realized [7]. In the cited study both the whole blood measurement and the measurement of plasma from the same whole blood specimen on the same analyzer gave the same results. The hyponatremia shown by the indirect (central laboratory) measurement on diluted plasma is shown for what it is – pseudohyponatremia. Any measurement using dilution without considering the proportion of plasma that cannot contain ions such as sodium, potassium, and chloride is subject to this. While the cited study references the Radiometer system, from personal knowledge I would expect the same sort of performance from the Corning-Chiron-Bayer-Siemens BGAs. It is probably true for the other major manufacturers who participated in the development of the CLSI/NCCLS standard C29-A and the NIST SRM.

On the other hand, there exist several unconfirmed reports that are less clear in methodology and, especially, in their assumptions based on anecdotal observations. These require confirmation and more extensive study, but they were more general in nature and simply made gross comparisons of central laboratory vs. BGA 'correlation' and the frequency of errors/discrepancies.

This gets us to one of the critical challenges I have written [8] and spoken [9] about in the past: communication among all interested parties, even those that speak the same language but in different dialects such as lab-speak, nurse-speak, doctor-speak, and the most difficult dialect of all, computer-speak. We may in fact have in these results an example of either crosstalk or failure to talk about the way in which direct ISEs can be and are measuring sodium, and how they are and can be linked to a recognized NIST SRM. The 'problem' was solved by C29-A and the NIST SRM, but was it communicated both internally at all ISE manufacturers and externally to the end user (the laboratory)?

The question remains, however, as to the status of electrolytes in blood. The problem is technically solved, but there may still be significant confusion as to what the differences mean. If we decide based on the agreement of experts such as the IFCC, CLSI, and NIST, we could easily conclude that systems have been harmonized by at least the two major manufacturers of BGA/electrolyte analyzers. The 'watering down' of the standardization process to get 'harmonized' results has worked [10]. More on the anomalies later, however.

 

References

  1. Królicka AL, Kruczkowska A, Krajewska M, Kusztal MA. Hyponatremia in infectious diseases – A literature review. Int. J. Environ. Res. Public Health, 2020, 17(15). https://doi.org/10.3390/ijerph17155320
  2. Belk W, Sunderman FW. A survey of the accuracy of chemical analyses in clinical laboratories. Arch. Pathol. Lab. Med., 1988, 112, 320-326.
  3. NCCLS. Standardization of sodium and potassium ion-selective systems to the flame photometric reference method; Approved Standard, NCCLS Document C29-A2, NCCLS, Villanova, PA, 2000. (Limited specialty circulation, no longer reviewed.)
  4. Maas AHJ. Proposed recommendations on ion-selective electrode determination of the substance concentration of sodium, potassium and ionized calcium in serum, plasma, or whole blood, in Clinical Chemistry, An Overview, Proceedings of the 13th International Congress of Clinical Chemistry and the 7th European Congress of Clinical Chemistry, eds. N.C. Den Boer, C. van der Heiden, B. Leijnse, J.H.M. Souverijn, 1988, pp. 39-62.
  5. Velapoldi RA, Paule RC, Schaeffer R, Mandel J, Moody JR in A Reference Method for the Determination of Sodium in Serum, National Bureau of Standards Special Publication, US Government Printing Office, Washington, DC, 1978, pp. 260-262.
  6. Velapoldi RA, Paule RC, Schaeffer R, Mandel J, Machlan LA, Gramlich JW in A Reference Method for the Determination of Potassium in Serum, National Bureau of Standards Special Publication, US Government Printing Office, Washington, DC, 1979, pp. 260-263.
  7. Vera MA, Sutphin A, Hansen L, MJ. Resolving pseudohyponatremia: Validation of plasma sodium on Radiometer ABL800 blood gas analyzers for immediate reflex testing. Lab. Med., 2022, 53(5), e105-e108. https://doi.org/10.1093/labmed/lmab114
  8. Moran RF. Point-of-care vs central lab “discrepancies”: Getting the message across. J. Appl. Lab. Med., 2017, 1(5), 595-597.
  9. Moran RF. Point-of-care vs central lab “discrepancies”: Getting the message across. J. Appl. Lab. Med., 2017. Podcast.
  10. Moran RF. POC testing and reporting of sodium, and other small molecules need modified IFCC source/type designations to improve operational efficacy and for clinically accurate, unambiguous reporting from LIMS and HIS. eJIFCC, 2023, 34(4), 271-275.

 

 
 
