HPS 66th Annual Meeting

Phoenix, Arizona
July 25-29, 2021

Single Session


TPM-C2 - Instrumentation

North 224AB   15:45 - 18:00

TPM-C2.1   15:45  Standardized Geiger-Mueller Tube Testing and Characterization Using a Custom Computerized Radiation Detection System CE O'Neil*, University of Michigan; JD Noey, University of Michigan; AJ Kent, University of Michigan; ME Trager, University of Michigan; KJ Kearfott, University of Michigan

Abstract: A Geiger-Mueller (GM) system was designed for high school outreach with the plan of using antique tubes obtained from US fallout shelters, EON 6210 or 6993, because of their availability and lack of cost. Soviet tubes, STS-5 and SBM-20, have advantages over the US tubes because of their more compact size, lower operating voltages, and more convenient terminals. ANSI standard N42.3-1999 (R2006) specifies several tests for determining the performance of GM tubes that are also informative for studying overall system performance. Longitudinal sensitivity, voltage, dead time, voltage plateau, and hysteresis tests were performed for this purpose. Early tests revealed high noise, particularly at the ~900 V required for the US tubes. This prompted a redesign of the circuit and a restriction of the design to the Soviet tubes. Pandemic conditions forced research to continue outside the lab, without the benefit of shielding materials, appropriate sources, environmental control, or the metering specified in the standard. Many tests were accomplished at home using cardboard boxes, a consumer multimeter, a ruler, a 25-year-old surplus oscilloscope, and a cellphone camera, with Fiestaware and thoriated lantern mantles as sources. This approximated some high school environments. As teaching rigorous instrument testing is part of a college laboratory course, upgrades to the experiments are underway. Appropriate point sources of Co-60 and Cs-137 and upgraded oscilloscopes were obtained. A rig was 3-D printed to enable consistently precise system and source positioning. An appropriate shield is being designed along with a small temperature-controlled enclosure. Ultimately the data from the tests will bracket the expected range of performance of legacy tubes of different types. Experiments based upon the tests will merit incorporation into a graduate laboratory class, with simplified versions using readily available equipment adapted for outreach programs.
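The dead time test mentioned in this abstract is commonly performed with the two-source method, though the abstract does not say which method was used. As an illustrative sketch only, with all count rates hypothetical, the first-order two-source estimate is:

```python
def two_source_dead_time(n1, n2, n12, nb=0.0):
    """Estimate GM dead time (seconds) via the two-source method.

    n1, n2 : observed count rates (counts/s) from each source alone
    n12    : observed count rate with both sources present
    nb     : background count rate
    Uses the common first-order approximation
        tau ~= (n1 + n2 - n12 - nb) / (2 * n1 * n2)
    """
    return (n1 + n2 - n12 - nb) / (2.0 * n1 * n2)

# Hypothetical count rates, not measured values from this work:
tau = two_source_dead_time(n1=100.0, n2=120.0, n12=210.0, nb=0.5)
# tau is on the order of a few hundred microseconds, typical of GM tubes
```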

TPM-C2.2   16:00  Understanding the Uncertainty Associated with Radiological Measurements JE Zamora*, Perma-Fix Environmental Services; AU Lopez, Perma-Fix Environmental Services; SA Walnicki, Perma-Fix Environmental Services

Abstract: All measurements contain some degree of uncertainty, regardless of the quantity being measured or the means and methods of the measurement. The reported result, usually a number and a unit, represents the best estimate of the true value of the measurand and is recognized as such within an appropriately applied level of confidence as indicated by the uncertainty of the measurement. Conversely, the uncertainty may be thought of as the range of possible values within which the true value of the measurand lies. Although typically omitted from everyday measurements (e.g., vehicle speed, temperature, furniture dimensions, and mass/volume of packaged goods), the reporting of uncertainty for radiological measurements is an essential component of the result, as it provides an indication of the quality and confidence level of the reported value. Understanding what uncertainty represents and how to apply it properly when interpreting a measured result is critical to making measurement-based decisions; neglecting, misunderstanding, or misapplying a measurement's uncertainty may yield unsatisfactory results and incur unnecessary burdens on resources and schedules. The purpose of this paper is to define uncertainty and explain its importance to a measurement, with an emphasis on radiological analyses. This paper focuses on gamma-spectrometric analyses for total radium (Ra-226 & Ra-228 (Ac-228)) in Naturally Occurring Radioactive Material (NORM) or Technologically Enhanced NORM (TENORM) and uses data to emphasize different sources of uncertainty, their impacts, and possible mitigation techniques. This paper also includes a general discussion of what reasonable uncertainties should be tolerated with respect to radiological and everyday measurements.
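As a worked illustration of the concept this abstract discusses: independent standard uncertainties combine in quadrature (GUM-style) and are commonly multiplied by a coverage factor k = 2 for roughly 95% confidence. The component names and values below are hypothetical, not data from the paper:

```python
import math

def combined_uncertainty(components):
    """Root-sum-square of independent standard uncertainties."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical relative standard uncertainties for a gamma-spec result:
u_counting   = 0.012   # counting statistics
u_efficiency = 0.020   # detector efficiency calibration
u_geometry   = 0.015   # sample geometry / density correction

u_c = combined_uncertainty([u_counting, u_efficiency, u_geometry])
U = 2.0 * u_c          # expanded uncertainty, coverage factor k = 2 (~95%)
```

Note that the largest component (here the efficiency calibration) dominates the combined value, which is why mitigation effort is best spent on the biggest terms first.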

TPM-C2.3   16:15  Quantification of Uranium in Aqueous Solution Using a Photon Counting Method RV Sistryak*, Clemson University

Abstract: A new system for detecting and quantifying uranium in dilute aqueous solution has been constructed and tested in the laboratory. Each element of the system has been chosen to render the resulting system adaptable for use in the field. Uranium is captured and concentrated by passing the dilute uranium solution under pressure through a membrane bearing a uranium-selective ligand on its surface. A silver-activated zinc sulfide (ZnS:Ag) scintillator is applied to the membrane. Visible light emitted from the ZnS:Ag scintillator in response to alpha radiation from the uranium is collected in one of two ways: with an acrylic lightguide or with an integrating sphere. Finally, the collected light photons are counted with a silicon photomultiplier. The results of detailed testing of both versions of the system (lightguide and integrating sphere) are promising. A detection rate of approximately 50% is achieved by optimizing the implementations of the elements of the system in various ways.
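A minimal sketch of how a detection rate such as the ~50% reported here might be computed from count rates. The function name and all numbers are hypothetical assumptions for illustration, not values from the abstract:

```python
def detection_efficiency(gross_cps, background_cps, emission_rate):
    """Fraction of alpha emissions registered by the photon-counting chain.

    gross_cps      : observed count rate with the sample present (counts/s)
    background_cps : count rate with no sample (counts/s)
    emission_rate  : expected alpha emission rate into the detector (per s)
    """
    return (gross_cps - background_cps) / emission_rate

# Hypothetical rates chosen to illustrate a ~50% detection rate:
eff = detection_efficiency(gross_cps=10.6, background_cps=0.6,
                           emission_rate=20.0)
```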

TPM-C2.4   16:30  Comparison of Different Simple Circuit Designs for a Raspberry Pi Based and Cell Phone Controlled Geiger-Mueller Radiation Detection System L Jautakas*, University of Michigan; S Tawfik, University of Michigan; AJ Kent, University of Michigan; MA Cooney, University of Michigan; CE O'Neil, University of Michigan; JD Noey, University of Michigan; KJ Kearfott, University of Michigan

Abstract: Circuit design plays an important role in the accuracy and reliability of Geiger-Mueller (GM) radiation detection systems. A GM radiation detection system was designed with the special constraints of being especially simple to build, troubleshoot, and understand. The associated circuit consisted of three main elements: a voltage booster to drive the GM tube, a filtering stage to remove unnecessary noise, and an inverter to digitize the signal. The circuit connects to a Raspberry Pi computer, which processes the signals and displays results on a cell phone through Bluetooth. The starting design consisted of a Raspberry Pi, basic electrical components (resistors, capacitors, an inductor, and transistors), and a printed circuit board. Unfortunately, the original design suffered from voltage spikes that would frequently break the Raspberry Pi. Several improvements to the initial basic circuit design were considered, one of which focused on protecting the computer from voltage fluctuations. Solutions to this problem included using a Zener diode or operational amplifier to limit the voltage going in and out of the computer, or using 3.3 V instead of 5 V to power the inverter so that the voltage coming into the computer would be acceptably small. For each of the modified designs, breadboards were used for the initial design phase, followed by the creation of prototype printed circuit boards prior to adoption of the design. Tests including voltage and current measurements in the frequency and time domains, as well as simplicity requirements and cost, led to the choice of powering the inverter with a lower voltage.
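A rough software analogue of the digitization stage described above: after the inverter produces a logic-level signal, counting pulses amounts to counting rising-edge threshold crossings. This is a hypothetical sketch operating on a simulated sampled trace, not the authors' implementation:

```python
def count_pulses(samples, threshold):
    """Count rising-edge threshold crossings in a sampled signal.

    Each transition from below to at-or-above `threshold` counts as one
    GM pulse, mirroring edge detection on a digital input pin.
    """
    count = 0
    above = False
    for v in samples:
        if v >= threshold and not above:
            count += 1
            above = True
        elif v < threshold:
            above = False
    return count

# Simulated 3.3 V logic trace containing three pulses:
trace = [0.0, 0.1, 3.2, 3.3, 0.2, 0.0, 3.1, 0.1, 0.0, 3.3, 3.2, 0.1]
n = count_pulses(trace, threshold=1.65)   # mid-rail threshold
```

Using a 3.3 V rail keeps the logic swing within what a Raspberry Pi GPIO input tolerates, which is the motivation for the lower-voltage inverter supply chosen in the abstract.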

TPM-C2.5   16:45  Characterization of a Prototype Thermoluminescent Dosimetry System and Determination of Optimal Heating Rates for Seven Different Dosimetric Materials JH Thiesen*, University of Michigan; CA Irvine, University of Michigan; CJ Stewart, University of Michigan; W Yu, University of Michigan; JD Noey, University of Michigan; KJ Kearfott, University of Michigan

Abstract: The Rexon UL-320-FDR, a prototype thermoluminescent dosimeter reader with removable planchets incorporating a non-contact infrared temperature measurement-feedback system, was characterized using seven dosimetric materials. One hundred 3.2 x 3.2 x 0.14 mm chips each of LiF:Mg,Ti, CaF2:Dy, CaF2:Tm, CaF2:Mn, Al2O3:C, CaSO4:Dy, and LiF:Mg,Cu,P were used. Calibration, dose-response, and heating rate experiments were conducted for each material. A 270 MBq Cs-137 source delivered three 15 mGy air kerma calibration doses to each chip to calibrate individual sensitivity response factors. A mid-range signal-dose linearity test was performed for each material, with ten dosimeters at each of ten different dose levels between 2.4 and 29 mGy. Glow curves were collected from all materials with heating rates between 1 and 20 °C s-1 to determine the effects on signal collection. The non-uniformity of the irradiation beam was analytically corrected using a quality control regime. Simple region-of-interest analysis alongside quantitative glow curve analysis was performed with automated in-house software. The data were used to compare reader performance to that of similar readers, design optimal time-temperature profiles for each material, and identify the magnitudes of various errors and uncertainties involved in the dosimetric process.
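Individual sensitivity response factors of the kind described above are often computed as the batch mean signal divided by each chip's mean signal under identical irradiations. A minimal sketch under that assumption; the chip IDs and signal values are hypothetical, not data from this work:

```python
def sensitivity_factors(readings):
    """Per-chip correction factors: batch mean over each chip's mean signal.

    `readings` maps chip id -> list of glow-curve signals from identical
    calibration irradiations. A factor > 1 means the chip under-responds
    relative to the batch and its readings should be scaled up.
    """
    chip_means = {cid: sum(r) / len(r) for cid, r in readings.items()}
    batch_mean = sum(chip_means.values()) / len(chip_means)
    return {cid: batch_mean / m for cid, m in chip_means.items()}

# Hypothetical signals (nC) from three identical 15 mGy irradiations:
factors = sensitivity_factors({
    "chip01": [98.0, 102.0, 100.0],   # mean 100
    "chip02": [118.0, 122.0, 120.0],  # mean 120, over-responder
    "chip03": [78.0, 82.0, 80.0],     # mean 80, under-responder
})
```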

TPM-C2.6   17:00  Optimization of Data Flow Infrastructure for a Weather and Radiation Monitoring System with Different Sensor Station Types S Tawfik*, University of Michigan; CC Huang, University of Michigan; AJ Kent, University of Michigan; JD Noey, University of Michigan; KJ Kearfott, University of Michigan

Abstract: With a primarily educational goal, a system consisting of multiple data stations equipped with a variety of meteorological and radiation sensors is undergoing development. Both improved professional and more affordable station designs are being engineered and tested. This work focuses on the intentional design and optimization of the data flow infrastructure for the overall system. The current hardware includes sensors of different types measuring a dozen different parameters. These sensors are connected directly to Windows machines, connected to Windows machines through data loggers, or connected directly to Raspberry Pi computers. While all stations currently communicate with the university computer network using internal accounts and channels, future stations are to be sited in high schools, museums, and other venues. This necessitates careful consideration of both efficient and safe data transmission to the university. The original data flow infrastructure design evolved from a basic university storage environment, employing one server designed for data acquisition that transferred data to servers optimized for website hosting. Usage of a commercial software repository was envisioned for data from external sensor stations. Due to a discovered security risk, one of the sensors had to be immediately reconfigured to connect through the data logger instead of directly to the Windows machine. The simplified and optimized data flow infrastructure design consists of a virtual machine containing both database and networking code. This virtual machine can readily and safely interface with both Windows and Raspberry Pi operating systems, and it secures the database credentials on a single server rather than on multiple machines. A new database was set up that allows a single interface to access data from multiple locations. The virtual machine also improves the security of the data due to the environment's sandboxing capabilities and complex user permission configurations.
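A toy sketch of the single-interface database idea, using an in-memory SQLite database as a stand-in for the server the virtual machine hosts. The table schema, station names, and sensor names are all hypothetical illustrations, not details from the abstract:

```python
import sqlite3

# In-memory stand-in for the single shared database on the virtual machine.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE readings (
    station TEXT, sensor TEXT, ts TEXT, value REAL)""")

# Stations of different types (Raspberry Pi, Windows + data logger)
# all write through the same interface:
rows = [
    ("roof-pi",  "gamma_cpm", "2021-07-25T15:45:00", 42.0),
    ("lab-win",  "temp_C",    "2021-07-25T15:45:00", 23.6),
]
con.executemany("INSERT INTO readings VALUES (?, ?, ?, ?)", rows)
con.commit()

# One query spans data from every location:
latest = con.execute(
    "SELECT station, sensor, value FROM readings ORDER BY ts").fetchall()
```

Centralizing writes behind one interface like this is what lets the credentials live on a single server instead of being distributed across every station machine.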

TPM-C2.7   17:15  Evaluation of a Low-pass Filter Algorithm for the Removal of Photomultiplier Tube Impulse Noise JH Thiesen*, University of Michigan; W Yu, University of Michigan; KJ Kearfott, University of Michigan

Abstract: Photomultiplier tubes are used in radiation protection instruments ranging from scintillator-based survey meters to optically stimulated luminescent dosimeter readout systems. Light photons registered by photomultiplier tubes originate from all geometrically available processes. Therefore, photomultiplier tube output integrates signals of interest and background photons. Some background signals may produce a high enough ambient level that the original signal is unrecognizable or has significant discontinuities. If this background occurs with sufficient frequency, repeating measurements becomes pointless. Thus, noise removal or reduction becomes the only way to ensure statistical accuracy. This work presents an algorithm for the reduction of sudden discontinuities, termed impulse noise, observed in photomultiplier tube signals. The algorithm employs a low-pass filter with an adaptive bandwidth. The bandwidth is determined iteratively from the arithmetic mean of an expanding set of data. Signal replacement is accomplished through linear interpolation over a short time segment corresponding to the noise duration. With proper calibration, the algorithm can serve as the basis of a robust filter capable of identifying and removing anomalous data. Presented results are generated from simulated noise of various shapes applied to a predetermined set of functions. The primary metric reported in these simulations was percent noise removal. For all demonstrations, the high noise removal percentage indicates that the algorithm is an effective pre-processing step intended to precede traditional smoothing methods.
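A minimal sketch of the approach as described: the acceptance band is derived from the arithmetic mean of an expanding set of accepted samples, and flagged impulses are replaced by linear interpolation across the noise segment. The scale factor k and the exact flagging rule are assumptions, since the abstract does not specify them:

```python
def remove_impulses(signal, k=3.0):
    """Replace impulse-noise samples by linear interpolation.

    The acceptance band is k times the arithmetic mean of all samples
    accepted so far (the "expanding set"); samples outside the band are
    treated as impulse noise. Assumes a signal with a positive baseline,
    so the running mean gives a meaningful band width.
    """
    out = list(signal)
    accepted = [out[0]]
    i = 1
    while i < len(out):
        mean = sum(accepted) / len(accepted)
        band = k * abs(mean)
        if abs(out[i] - mean) > band:
            j = i + 1                     # walk past the whole impulse
            while j < len(out) and abs(out[j] - mean) > band:
                j += 1
            right = out[j] if j < len(out) else out[i - 1]
            for m in range(i, j):         # linear-interpolation fill
                out[m] = out[i - 1] + (right - out[i - 1]) \
                    * (m - i + 1) / (j - i + 1)
            i = j
        else:
            accepted.append(out[i])
            i += 1
    return out

noisy = [1.0, 1.1, 0.9, 1.0, 9.0, 1.0, 1.1]
cleaned = remove_impulses(noisy, k=3.0)   # the 9.0 spike is interpolated away
```

As the abstract notes, this is a pre-processing step: the interpolated output would still go through a traditional smoothing filter afterward.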
