IR Thermometers: Determining Blood Product Temperature on Return to Blood Bank

Join us as we explore common misconceptions about IR thermometers and the difference between surface temperature and core temperature

The following PathLabTalk post describes something we encounter repeatedly in blood banks, and it illustrates several misconceptions about determining blood product temperature on return to the blood bank. At the same time, the post demonstrates a reliable, good practice!

“Our transfusion service is looking for an infrared thermometer that we can use to determine the internal temperature of our donor units. We issue products in validated coolers to surgery, ED, and other locations and sometimes we receive the products back that have not been used. Currently, we attach a temperature indicator to the unit, but want something more accurate that is not difficult to operate, calibrate, etc….”1

First, we congratulate this PathLabTalk poster for reaching out to the blood bank community for ideas and support. He or she has realistic, everyday needs that many blood bankers will identify with. In this VUEPOINT, we examine the problems and misconceptions, along with the good practices, communicated in the post.

Problem #1
Infrared Thermometers (a.k.a. Infrared “Guns”) do not measure “internal temperature”

Infrared thermometers are used to measure surface temperature without contacting the product being measured. Most specifications for these devices state “non-contact surface temperature measurement.” They do not measure the “core temperature” of any product, including blood. Infrared thermometers are used in many different applications, ranging from food service to residential heating/cooling to industrial settings. Typically, they are used for hard-to-reach areas where contact temperature measurement is difficult or impossible.

Problem #2
A one-time temperature reading of a returned blood product provides little assurance that the blood was kept at the correct storage or transport temperature over the entire time period that it was out of blood bank control.

No matter how well everyone is trained and how conscientious they are, we all know that once the blood product leaves the blood bank, anything can happen. For example, in the OR or ED there are many critical functions being performed at once, and often over long periods of time. Blood can be taken from the cooler and left on a table for hours – and then returned to the blood bank, having had time to “cool” to what may appear to be an acceptable temperature.

The precariousness of this situation is compounded when an IR thermometer is used to check SURFACE temperature when the product is returned.
• Perhaps the blood bag was placed on an OR table for 6 hours during a long procedure, but was placed back in the cooler just long enough for the SURFACE to cool to a seemingly “acceptable” temperature…?
• Is it safe to assume that the blood temperature never reached unsafe temperatures?
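A quick sketch makes the point concrete. Here is a minimal illustration in Python (the temperature history is entirely hypothetical) of how a one-time reading on return can look acceptable even though an excursion occurred:

    # Minimal sketch: why a one-time reading on return can miss an excursion.
    # The temperature history below is entirely hypothetical.
    THRESHOLD_C = 10.0  # indication temperature of an irreversible indicator

    # Hypothetical core-temperature history (°C) while out of blood bank
    # control: warmed on an OR table, then re-cooled before return.
    history_c = [4.0, 8.0, 14.0, 18.0, 16.0, 9.0, 6.0, 5.0]

    one_time_reading = history_c[-1]  # what a spot check sees on return
    excursion_occurred = any(t > THRESHOLD_C for t in history_c)

    print(f"Reading on return: {one_time_reading} °C – looks acceptable")
    print(f"Exceeded {THRESHOLD_C} °C while out: {excursion_occurred}")

An irreversible indicator behaves like the `excursion_occurred` check: once tripped, re-cooling cannot reset it. A spot reading on return sees only the last value.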

Problem #3
Accuracy of IR thermometers is affected by how they are used – due to “distance to target,” a.k.a. field of view (FOV) or distance-to-spot (D/S) ratio

Accuracy specifications for most IR thermometers range from +/- 1.0°C to +/- 2.0°C. Looking at a few models from major manufacturers, their D/S specifications are documented at distances of 1 meter, 1.5 meters and 2 meters – which might surprise blood bankers using IR thermometers. The distance between the device and the target (the blood product) affects reading accuracy, which means the dependability and repeatability of readings are also “user dependent”: different users hold the thermometer at different distances (you’ll read about this problem in the PathLabTalk post responses). If you’d like to learn more about this characteristic, Grainger (an industrial and facilities maintenance equipment resource) has an excellent “Quick Tip” on IR thermometers.
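To make the distance effect concrete, here is a minimal sketch assuming a hypothetical 12:1 D/S ratio (actual ratios vary by model – check your device’s specification). It shows how the measured spot grows with distance, so readings taken from farther away average in more of the surroundings:

    # Minimal sketch: how an IR thermometer's measurement spot grows with
    # distance, for an assumed (model-dependent) D/S ratio of 12:1.
    DS_RATIO = 12.0  # distance-to-spot ratio; check your device's spec sheet

    def spot_diameter_cm(distance_cm: float) -> float:
        """Approximate diameter of the surface area averaged into one reading."""
        return distance_cm / DS_RATIO

    for d in (10, 30, 100):  # measurement distances in cm
        print(f"At {d} cm, one reading averages a ~{spot_diameter_cm(d):.1f} cm wide spot")

At arm’s length, the spot can easily grow wider than the blood bag itself, so the reading may include the table or cooler surface behind it.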

Good Practice

Temperature indicators provide irreversible visual evidence if the blood product exceeds its specified temperature, even if the product is “re-cooled” prior to return to the blood bank.
Attaching an irreversible temperature indicator to the blood product, as done by the PathLabTalk poster, IS one way to know if the blood was maintained at correct storage/transport temperatures. When the blood product reaches the indication temperature (typically 6°C or 10°C), the indicator provides irreversible visual evidence of the temperature excursion – even if the blood product is “re-cooled” after being out of blood bank control.

This raises a question similar to the one we asked about IR thermometers: do temperature indicators that adhere to the surface of the blood product indicate surface temperature or core temperature?

Safe-T-Vue indicators (Temptime Corporation) are chemical indicators, and the algorithms used to formulate them are based on thousands of laboratory tests that incorporate the CORE blood product temperature. While the indicator is applied to the surface, its color response is correlated to core temperature.

Why this all matters

Bacteria are very rarely transmitted during blood component transfusion, but when they are, they usually cause severe, life-threatening adverse reactions, with a mortality rate of 20-30%.2

Periodically, there are reports of incidents – likely to have caused serious injury or death – linked to bacterial sepsis from blood products that were dispensed for extremely long surgeries and returned to the blood bank unused.3 Those very products could be returned to inventory after a temperature check with an IR thermometer and reissued to another patient. Unbeknownst to the blood bank, the blood products may have reached temperatures that allowed contaminants to thrive. When such a unit is reissued and transfused to the next patient, the results can be catastrophic.

Kudos to the PathLabTalk poster for good practice

In summary, we understand that it’s unrealistic to assume that blood products will be handled outside of the blood bank with the same watchful eye and expertise that trained, focused blood bankers have. And even when a temperature-sensing device is used to check the product on its return to the blood bank, there is little certainty that it was maintained at appropriate storage/transport temperatures the entire time.

The surest way to know that the blood did not reach non-compliant temperatures – temperatures that could result in bacterial sepsis – is to use an irreversible temperature indicator that stays with the blood product during its entire time out of the blood bank, as the PathLabTalk post noted.

References
1. http://www.pathlabtalk.com/forum/index.php?/topic/6436-infrared-thermometer/

2. http://cdn.intechopen.com/pdfs-wm/27955.pdf

3. http://www.jsonline.com/watchdog/watchdogreports/problems-at-hospital-lab-show-lax-regulation-hidden-mistakes-b99585186z1-330324081.html

Cooler Validation: Comparison of “Manual” Thermometer vs. “Automated” Data Logger Methods

In our March 2012 survey of over 70 blood banks, many respondents described cooler validation as a “pain,” characterizing it as time-consuming, frustrating and even primitive.

Most blood banks revalidate their transport coolers annually. And although it is only once a year, there never seems to be a good time or resource-efficient way to do it.


Using multiple data loggers allows more accurate temperature mapping of the cooler interior.

The three key factors we hear repeated most often are:
1. Time Efficiency (technician’s time)
2. Data Accuracy
3. Simplifying Documentation

At the SCABB/CBBS meeting last month, we had compelling discussions with blood bankers who have switched from manual cooler validations with thermometers to automated validations with data loggers (electronic temperature recorders). Some of them are using the Val-A-Sure™ Cooler Validation Kit.

If you’ve ever considered switching to an automated validation process, we thought it might be helpful to share what we’ve learned from blood bankers across the country. In the following table, we compare the traditional “manual” thermometer method to the “automated” data logger method – and capture how the switch has changed their validation experiences.

This graph displays the temperature of the top bag vs. the bottom bag. The data is downloaded from the data loggers and printed for permanent validation documentation, eliminating handwritten and transcribed data.
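As a rough illustration of the automated workflow (file names and column labels here are hypothetical – use whatever your logger software actually exports), a few lines of Python can turn two logger downloads into a printable chart:

    # Minimal sketch: chart top-bag vs. bottom-bag logger exports for the
    # validation file. CSV names and columns are hypothetical placeholders.
    import pandas as pd
    import matplotlib.pyplot as plt

    top = pd.read_csv("top_bag_logger.csv", parse_dates=["timestamp"])
    bottom = pd.read_csv("bottom_bag_logger.csv", parse_dates=["timestamp"])

    plt.plot(top["timestamp"], top["temp_c"], label="Top bag")
    plt.plot(bottom["timestamp"], bottom["temp_c"], label="Bottom bag")
    plt.axhline(1, linestyle="--", color="gray")   # example acceptance limits;
    plt.axhline(10, linestyle="--", color="gray")  # use your SOP's range
    plt.xlabel("Time")
    plt.ylabel("Temperature (°C)")
    plt.legend()
    plt.savefig("cooler_validation.png")  # printable record, no transcription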

We’ll be giving away a Val-A-Sure Cooler Validation Kit at AABB 2015, so if you’re interested in a “free” chance to change your cooler validation method, be sure to stop by and see us!

Jeffrey Gutkind
jeffg@temptimecorp.com

P.S. For more on Transport and Storage Coolers, check out our Tips, Helpful Ideas and AABB Standards References on www.williamlabs.com.


Are all temperature indicators created equal?

When it comes to cost and performance, how do you choose?

by Jeffrey Gutkind

In today’s cost-conscious healthcare environment, our immediate reaction when making a buying decision is to minimize purchase cost. The temperature indicators currently on the market vary in price, and there are questions you may be asking:

  • Is the “cheapest” purchase price going to save the blood bank money overall?
  • Are all indicators equal in terms of performance?
  • How do you know which indicator to choose?

To answer these questions, let’s take a step back and question WHY we even use temperature indicators.

Temperature indicators for blood products were originally designed to provide assurance that blood product temperatures did not exceed AABB temperature guidelines while the blood was out of the blood bank’s control – in other words, proof that the product was maintained at the proper temperature the entire time it was away.

Numerous visitors to our AABB booth a few weeks ago stated that 40-50% of the blood issued from their blood banks is not used. To further illustrate the challenge, a journal article recently published in Transfusion (shared in our August 2014 VUEPOINT) described a study in which a blood bank found that most of its blood waste came from either temperature or time (away from the blood bank) excursions, and that 70% of those losses came from blood products issued to the OR in coolers. Temperature indicators are used by blood banks worldwide for exactly this reason: to provide assurance that the blood products at no time exceeded temperature thresholds, to help maintain blood product quality, and to minimize blood waste.

So, other than cost, what matters when choosing an indicator?

Let’s circle back to our initial questions about temperature indicator cost and performance. Since the job of a temperature indicator is to report temperature information back to the blood bank, the indicator’s temperature ACCURACY (also referred to as “tolerance”) is critical.

As an example, each of the three most popular 10°C temperature indicators on the market today publishes a different accuracy specification:

  • Safe-T-Vue 10: +/- 0.4°C
  • Indicator A: +/- 0.5°C
  • Indicator B: +/- 1.0°C

How does indicator accuracy influence blood product waste?

In this illustration, you can see that a 10°C indicator with an accuracy of +/- 1.0°C may actually “trip” at 9°C, falsely indicating that the temperature of the blood is out of specification. And, as we all know, the cost of the wasted blood itself far exceeds the purchase price of an indicator – and minimizing blood waste (not indicator cost) is the primary objective behind using a temperature indicator.

Using an average cost of $250.00 for a single wasted blood unit, it’s easy to calculate the potential savings of using a more accurate temperature indicator. The cost difference between temperature indicators is minimal in comparison to the cost of one wasted unit of blood.
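As a back-of-the-envelope sketch (the issue volume and false-trip rates below are purely illustrative assumptions, not published figures):

    # Back-of-the-envelope sketch: annual cost of falsely "tripped" indicators.
    # Every input below is an illustrative assumption; substitute your own data.
    cost_per_wasted_unit = 250.00    # average cost of one wasted unit (USD)
    units_checked_per_year = 2000    # hypothetical returns checked per year

    # Hypothetical false-trip rates: a wider tolerance trips early more often.
    false_trip_rate_tight = 0.005    # e.g., a +/- 0.4 °C indicator
    false_trip_rate_loose = 0.020    # e.g., a +/- 1.0 °C indicator

    extra_waste = units_checked_per_year * (false_trip_rate_loose - false_trip_rate_tight)
    print(f"Extra falsely wasted units per year: {extra_waste:.0f}")
    print(f"Extra annual cost: ${extra_waste * cost_per_wasted_unit:,.2f}")

With these assumed numbers, even a 1.5-percentage-point difference in false trips (30 units, $7,500 per year) dwarfs any per-indicator price difference.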

When comparing temperature indicators to make a buying decision, be sure to make ACCURACY a key factor in your selection process. Safe-T-Vue indicators are available with 6°C and 10°C indication temperatures, both accurate within +/- 0.4°C.*

As always, we welcome your comments and feedback on the ideas presented in this VUEPOINT.

Sincerely,

Jeffrey Gutkind
jeffg@temptimecorp.com

* Refer to AABB standards for blood banks and transfusion services, 21 CFR 640.2, 21 CFR 640.4, and 21 CFR 600.15.

When should I use a probe for temperature validations?

Here are a few guidelines for validation with or without a probe, based on what you told us

With our daily giveaway contest of a Val-A-Sure Cooler Validation Kit at AABB, we had the opportunity to talk with many of you about the validation procedures in your blood banks. Hands down, the most frequently asked question was “when do I use temperature recorders with external temperature probes – and when should I use temperature recorders with built-in sensors?” Our Advantage Kit is configured with both types, since our research showed us that validation is done both ways for a number of reasons.

When we conducted product development beta tests and interviews with your peers, we learned a lot.  Here are a few guidelines for validation with or without a probe, based on what you told us.

Using Built-In Sensor Temperature Recorders

  1. When temperatures among, around, or between the bags in the cooler are to be recorded
  2. When several points within the cooler need to be recorded
  3. Where the compactness of the built-in sensor makes it easier to fit into the cooler/space
  4. In a larger cooler with six or more bags, where numerous areas within the cooler need to be recorded, such as near the top, near the bottom, the sides and the ends
  5. General monitoring of all sizes of refrigerators, freezers, and ovens

Using Temperature Recorders with Probes

  1. When core temperature of blood bags is to be recorded
  2. For larger coolers where specific, more pin-point locations need to be temperature-monitored
  3. General monitoring of all sizes of refrigerators, freezers, and ovens where the probe may or may not be in a liquid

The Val-A-Sure Advantage Kit is supplied with 2 TRIX-8 Recorders (built-in sensors) and 2 TREX-8 Recorders with Bag-Sealer Probes. For those of you who prefer different recorder configurations, the Custom Kit allows you to select exactly the type and quantity of temperature recorders that will work best for your needs.

The temperature recorders have a range of -40 to +85°C, and we’re learning that some labs are using them for validations beyond transport coolers. Every recorder is supplied with a calibration document identifying the instruments used for calibration and their traceability to a NIST standard.

If you have any other guidelines or suggestions to share, please feel free to Comment on this post (below) and we’ll be sure to pass them on to your peers!

And don’t forget, we have validation procedure videos here on the William Labs website.

Simulating Platelets for Validations

Guidance in using an average density to simulate platelets for validations

After reading our VUEPOINT post – “Simulated Blood Products: 10% Glycerol in Water May NOT Be ‘One Size Fits All’” – which presented “recipes” for simulated blood products (Red Blood Cells, Whole Blood and Plasma), one of our VUEPOINT readers posted a comment on our website. The question was about platelets: what water-glycerol mixture simulates them, just as we had provided for the other blood products? Great question, and we’re glad you asked!

How do we calculate an accurate mixture based on varying platelet densities?

Because of the density range of platelets, if you were striving to be highly accurate, you would need to know which density group the platelets fall into. Various professional papers discuss high, low and other density groups. Here is a reference from the University of Virginia School of Medicine that classifies platelets into three density classes, with an average density for each class.

Another platelet density analysis reported “…normal platelets layered onto Percoll formed a band extending from 1.0625 g/ml to 1.0925 g/ml, with a mean platelet density of 1.0775 g/ml…”.1

In response to our VUEPOINT reader’s inquiry, we have modified our graph and recommended water-glycerol mixtures to include a recipe for platelets (density 1.066 g/ml, 26% glycerol). The graph plots % Glycerol (y-axis) against Density/Specific Gravity (x-axis) for Plasma, Whole Blood, Platelets and RBCs.
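For readers who want a numerical approximation of the graph, here is a minimal sketch that linearly interpolates % glycerol between pure water (1.000 g/ml, 0% glycerol) and the platelet recipe above (1.066 g/ml, 26% glycerol). Treat it as a rough stand-in for the published curve, not a replacement:

    # Minimal sketch: approximate % glycerol (w/w) in water for a target
    # density, by linear interpolation between pure water and this article's
    # platelet recipe. A rough approximation of the published graph.
    WATER_DENSITY = 1.000   # g/ml, pure water near room temperature
    ANCHOR_DENSITY = 1.066  # g/ml, mean platelet density used in this article
    ANCHOR_GLYCEROL = 26.0  # % glycerol recommended for that density

    def glycerol_percent(target_density: float) -> float:
        slope = ANCHOR_GLYCEROL / (ANCHOR_DENSITY - WATER_DENSITY)  # % per g/ml
        return (target_density - WATER_DENSITY) * slope

    # Example: the mean platelet density from the Percoll study quoted above
    print(f"{glycerol_percent(1.0775):.1f}% glycerol targets 1.0775 g/ml")  # ~30.5%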

Recommended “Recipes” for simulated blood products

Based on the data presented in this VUEPOINT, we recommend that you consider using the following mixtures for blood product simulation.

Stir for a few minutes to ensure a homogeneous solution. Be sure to follow any precautions supplied by the glycerol manufacturer for handling pure glycerol.

Other Sources for Platelet Density Information

For those of you who are interested in digging a little deeper into platelet density, here is a link to another reference that reports blood density determination:
Heterogeneity of human whole blood platelet subpopulations. I. Relationship between buoyant density, cell volume, and ultrastructure. Corash L, Tan H, Gralnick HR. Blood. 1977 Jan;49(1):71-87.

Please Share Your Questions and Feedback

We always appreciate questions like these that give us an opportunity to do some research and share more valuable information, with the goal of making your job a little easier if we can. Please don’t hesitate to post a COMMENT to any of our VUEPOINT articles if you have something to share, or would like us to “dig a little deeper” for our mutual learning.
info@williamlabs.com
1-800-767-7643

1. Platelet-Density Analysis and Intraplatelet Granule Content in Young Insulin-Dependent Diabetics, A. Collier, H.H.K. Watson, D.M. Matthews, L. Strain, C.A. Ludlam, and D.F. Clarke, Diabetes, Vol. 35, October 1986.