How to Settle a Geoengineering Bet
Unusual levels of aluminum and other materials have been measured in rain and snow samples, but did they come from jet planes or some other source?
Geoengineering, sometimes referred to by its misnomer “chemtrails,” is not what I call an entry-level conspiracy. Unlike many other conspiracies in the cluster of suspicious scenarios around health care harms, natural remedies and environmental toxins, there is no simple personal benefit to be gained by making a lifestyle choice aimed at testing this conspiracy hypothesis directly with one’s own experience. Most of us have no access to jet plane mechanical systems, and there are (almost) no insider whistleblowers whose claims we might listen to and whose veracity we might judge.
The bulk of the evidence relating to geoengineering comes from lab testing of snow and rain samples, collected after jet trails were observed in the sky, that show measurably elevated levels of aluminum, barium and other contaminants. Although it’s generally agreed the measured levels are higher than normal, there is still some disagreement as to whether the jet contrails themselves were the source of the contaminants or whether some other ground or airborne source lofted the materials, which then mixed with precipitation before falling back to Earth.
One technique for determining the mix of chemical compounds in smokestack emissions, smoke clouds from fires and other aerosol pollutant “clouds” is optical spectroscopy.
The basic idea of optical spectroscopy is that different chemical compounds absorb and/or emit different frequencies of light because of the resonance between their molecular structures and photon energies. If one has a strong, nearly white (all frequencies) light source such as sunlight and can observe it reflected off of an aerosol pollutant (smoke cloud, billowing smokestack plume or jet contrail), then in theory an optical spectrometer attached to a spotting telescope can reveal tell-tale absorption lines (and maybe even emission lines) in the spectrum of the received light and, with a little luck, identify the compounds present in the aerosol sample.
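To make that concrete, one common approach is to record a reference spectrum of clear-sky sunlight, record a second spectrum through or reflected off the aerosol, and compute absorbance as the negative log of their ratio (the Beer-Lambert idea). The sketch below is only illustrative; the file names and two-column layout are assumptions, not the export format of any particular instrument.

import numpy as np

# Assumed two-column text files (wavelength_nm, intensity); adjust to your
# spectrometer's actual export format.
reference = np.loadtxt('clear_sky_reference.txt')   # hypothetical file name
sample = np.loadtxt('contrail_sample.txt')          # hypothetical file name

wavelengths = reference[:, 0]
i_ref = reference[:, 1]
i_sample = sample[:, 1]

# Beer-Lambert style absorbance: deeper dips mean stronger absorption at that
# wavelength. A small epsilon avoids division by zero.
eps = 1e-12
absorbance = -np.log10((i_sample + eps) / (i_ref + eps))

# Wavelengths where absorbance is strongest are candidates for matching
# against reference line lists.
strongest = wavelengths[np.argsort(absorbance)[-10:]]
print("Strongest absorption near (nm):", np.sort(strongest))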
One company that produces digital optical spectrometers is Ocean Optics.1 These spectrometers can produce digital files describing the intensities and frequencies of light sampled.
With regard to the next steps, I have very little personal knowledge or experience to go on aside from some amateur astronomy and physics-student exercises many years ago, so let’s ask my AI friend DeepSeek.
Given the trailheads in DeepSeek’s response below, it appears there may be several paths to getting the kinds of answers we’re looking for from the combined spectrometer and software resources mentioned. We should even be able to take multiple samples of the same contrail as it dissipates and watch the spectral signatures of the various compounds lose intensity along with it, thus establishing a correlation of the compounds with the contrail itself rather than with widely pre-dispersed background compounds that would not correlate strongly with the shape of the contrail. A rough sketch of that time-series idea follows.
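As an illustration of that dissipation test, the sketch below tracks the depth of a single absorption feature across a sequence of spectra taken a few minutes apart. The file names, the two-column text format and the wavelength window are all assumptions; it is meant to show the shape of the analysis, not a finished tool.

import numpy as np

# Hypothetical sequence of spectra taken as the contrail dissipates,
# each a two-column text file: wavelength_nm, intensity.
files = ['contrail_t0.txt', 'contrail_t5.txt', 'contrail_t10.txt', 'contrail_t15.txt']

def feature_depth(path, lo_nm, hi_nm):
    """Depth of an absorption feature inside [lo_nm, hi_nm], measured as the
    drop from a crude local continuum (edges of the window) to the minimum."""
    data = np.loadtxt(path)
    wl, inten = data[:, 0], data[:, 1]
    window = (wl >= lo_nm) & (wl <= hi_nm)
    segment = inten[window]
    continuum = 0.5 * (segment[0] + segment[-1])  # straight-line continuum estimate
    return continuum - segment.min()

# Example window around roughly 393-398 nm, near aluminum's strong neutral
# lines (verify exact values against the NIST atomic spectra database).
depths = [feature_depth(f, 393.0, 398.0) for f in files]
for f, d in zip(files, depths):
    print(f"{f}: feature depth = {d:.4f}")

# If the feature weakens as the contrail disperses while background features
# stay constant, that supports a link between the compound and the contrail.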
Who else is interested in pulling together the technical, financial and equipment resources to make this kind of scientific determination as to the content of geoengineering trails a reality? (Please comment below)
DeepSeek’s Guidance Related to Spectral Analysis Software:
Q: I have a digital output file from an Ocean Optics optical spectrometer and I want to analyze it to determine what aerosol pollutant compounds were in the smoke cloud sampled using reflected sunlight. What software or other tools can I use to perform such analysis?
A: Analyzing a digital output file from an Ocean Optics optical spectrometer to determine aerosol pollutant compounds in a smoke cloud using reflected sunlight involves several steps, including spectral data processing, identification of spectral features, and comparison with reference spectra. Here are some tools and software you can use for this analysis:
1. Ocean Optics SpectraSuite
Purpose: This is the native software provided by Ocean Optics for capturing and analyzing spectral data.
Features: It allows you to visualize the spectral data, perform basic processing (e.g., smoothing, baseline correction), and export data for further analysis.
Use Case: Use SpectraSuite to initially inspect the spectral data and ensure it is clean and properly calibrated.
2. Python with Scientific Libraries
Libraries: NumPy, SciPy, Matplotlib, Pandas, Scikit-learn
Purpose: Python is highly versatile and can be used for advanced data analysis, including spectral data processing and machine learning.
Features:
NumPy/SciPy: For numerical operations and signal processing (e.g., smoothing, Fourier transforms).
Matplotlib: For plotting and visualizing spectral data.
Pandas: For data manipulation and handling.
Scikit-learn: For machine learning tasks, such as clustering or classification of spectral data.
Use Case: Write custom scripts to process the spectral data, identify peaks, and compare them with known reference spectra of aerosol pollutants.
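For example, a minimal smoothing and baseline-correction pass with NumPy and SciPy might look like the sketch below. The file name and two-column layout are assumptions, and the low-order polynomial baseline is a deliberately simple stand-in for more careful continuum fitting.

import numpy as np
from scipy.signal import savgol_filter

# Assumed two-column export: wavelength_nm, intensity.
data = np.loadtxt('spectral_data.txt')
wavelengths, intensities = data[:, 0], data[:, 1]

# Savitzky-Golay smoothing preserves peak shapes better than a plain moving average.
smoothed = savgol_filter(intensities, window_length=11, polyorder=3)

# Crude baseline: fit a low-order polynomial to the whole spectrum and subtract it.
coeffs = np.polyfit(wavelengths, smoothed, deg=3)
baseline = np.polyval(coeffs, wavelengths)
corrected = smoothed - baseline

print("Baseline-corrected spectrum ready for peak finding:", corrected.shape)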
3. MATLAB
Purpose: MATLAB is a powerful tool for numerical computing and data analysis.
Features: It has built-in functions for signal processing, curve fitting, and spectral analysis.
Use Case: Use MATLAB to perform advanced spectral analysis, including peak fitting, baseline correction, and comparison with reference spectra.
4. GRAMS/AI (Thermo Fisher Scientific)
Purpose: GRAMS/AI is a comprehensive software suite for spectral analysis.
Features: It includes tools for spectral data processing, chemometrics, and multivariate analysis.
Use Case: Use GRAMS/AI for in-depth spectral analysis, including identifying and quantifying aerosol pollutants based on their spectral signatures.
5. ENVI (Harris Geospatial Solutions)
Purpose: ENVI is a software for analyzing and processing geospatial imagery, including hyperspectral data.
Features: It includes tools for spectral analysis, including spectral library matching and classification.
Use Case: Use ENVI if your data is part of a larger geospatial dataset, or if you need to perform advanced spectral classification.
6. Spectral Libraries
NIST Chemistry WebBook: Provides reference spectra for many compounds.
HITRAN: A database of high-resolution spectroscopic data, useful for atmospheric gases.
USGS Spectral Library: Contains reflectance spectra of various materials, including some aerosols.
Use Case: Compare your spectral data with these libraries to identify specific aerosol pollutants.
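As a rough sketch of library matching, the snippet below compares candidate peak wavelengths against a small hand-entered table of approximate aluminum and barium atomic lines. The listed wavelengths are approximate and should be verified against the NIST atomic spectra database, and the peak list here is a placeholder for output from a real peak finder.

import numpy as np

# Approximate reference lines (nm); verify against the NIST atomic spectra database.
reference_lines = {
    'Al I 394.4': 394.4,
    'Al I 396.2': 396.2,
    'Ba II 455.4': 455.4,
    'Ba II 493.4': 493.4,
    'Ba I 553.5': 553.5,
}
names = np.array(list(reference_lines.keys()))
lines = np.array(list(reference_lines.values()))

# Placeholder peak wavelengths, e.g. produced by scipy.signal.find_peaks.
candidate_peaks_nm = [394.6, 486.3, 553.7]

tolerance_nm = 0.5  # depends on the spectrometer's resolution
for peak in candidate_peaks_nm:
    nearest = np.argmin(np.abs(lines - peak))
    if abs(lines[nearest] - peak) <= tolerance_nm:
        print(f"Peak at {peak} nm is within {tolerance_nm} nm of {names[nearest]}")
    else:
        print(f"Peak at {peak} nm has no match within {tolerance_nm} nm")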
7. OpenSpectra
Purpose: Open-source software for spectral data analysis.
Features: It provides tools for spectral data processing, visualization, and analysis.
Use Case: Use OpenSpectra for basic to intermediate spectral analysis tasks.
8. R with Chemometrics Packages
Packages: chemometrics, hyperSpec
Purpose: R is a statistical computing language with packages specifically designed for chemometrics and spectral analysis.
Features: These packages offer tools for multivariate analysis, classification, and regression.
Use Case: Use R for statistical analysis and modeling of spectral data.
Steps for Analysis:
Data Preprocessing: Smooth the data, remove noise, and correct the baseline.
Peak Identification: Identify peaks in the spectrum that correspond to specific compounds.
Spectral Matching: Compare the identified peaks with reference spectra from spectral libraries.
Quantitative Analysis: If possible, quantify the concentration of identified pollutants using calibration curves or multivariate analysis (a minimal calibration-curve sketch follows this list).
Visualization: Plot the results for interpretation and reporting.
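To illustrate the calibration-curve step, here is a minimal sketch that fits a straight line to hypothetical standards of known concentration and uses the fit to estimate an unknown. The numbers are invented placeholders; real calibration requires standards measured on the same instrument under the same conditions.

import numpy as np

# Hypothetical calibration standards: known concentration (ppm) vs. measured
# peak absorbance for one compound. Replace with real standards.
concentrations_ppm = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
peak_absorbance = np.array([0.01, 0.11, 0.20, 0.52, 1.02])

# Least-squares straight-line fit: absorbance = slope * concentration + intercept.
slope, intercept = np.polyfit(concentrations_ppm, peak_absorbance, deg=1)

# Estimate the concentration behind an unknown sample's peak absorbance.
unknown_absorbance = 0.35
estimated_ppm = (unknown_absorbance - intercept) / slope
print(f"Estimated concentration: {estimated_ppm:.2f} ppm")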
Example Workflow in Python:
import numpy as np
import matplotlib.pyplot as plt
from scipy.ndimage import gaussian_filter
from scipy.signal import find_peaks

# Load your spectral data (two columns: wavelength in nm, intensity)
data = np.loadtxt('spectral_data.txt')
wavelengths = data[:, 0]
intensities = data[:, 1]

# Smooth the data (optional) with a Gaussian filter
intensities_smoothed = gaussian_filter(intensities, sigma=2)

# Find peaks above a minimum height (tune 'height' for your data)
peaks, _ = find_peaks(intensities_smoothed, height=0.1)

# Plot the spectrum with identified peaks
plt.plot(wavelengths, intensities_smoothed)
plt.plot(wavelengths[peaks], intensities_smoothed[peaks], "x")
plt.xlabel('Wavelength (nm)')
plt.ylabel('Intensity')
plt.show()

# Compare with reference spectra. load_reference_spectra() and
# find_closest_match() are placeholders (pseudo-code) to be replaced with
# real lookups against a spectral library such as NIST or USGS.
reference_spectra = load_reference_spectra()
for peak in peaks:
    match = find_closest_match(wavelengths[peak], reference_spectra)
    print(f"Peak at {wavelengths[peak]} nm matches with {match}")
Miscellaneous Resources:
https://x.com/RealAlexJones/status/1887538575681401049
geoengineeringwatch.org
Kristen Meghan
UV optical spectroscopy for monitoring cargo shipping emissions: https://www.laserfocusworld.com/test-measurement/article/14298715/uv-spectroscopy-becomes-a-powerhouse-for-emissions-monitoring
Ocean Optics: https://www.oceanoptics.com/spectrometers/#
If you know a lab that would conduct this research, I'd be interested in hiring them. I think that analyzing the spectra is not the hard part. Absorption lines of barium and aluminum are well-known.