Young woman with dark hair, wearing a dark top and blue jeans holding a laptop with a jumble of stickers on the cover, standing to the left of a screen projection entitled "Source Extraction Issues," with a series of six astronomical images showing a bright light source slowly being reduced in steps by a light reduction program.
GLAS intern Avery Metzcar, a University of Chicago astrophysics major, shows the work she did in summer 2023 on a machine learning program designed to predict optimum conditions for astronomical photography.

GLAS intern Avery Metzcar, a University of Chicago astrophysics major, spent her entire summer internship following in the footsteps of Hubble research astronomer Dr. Amanda Pagul. Her goal was to apply the techniques Dr. Pagul used to improve the quality of Hubble images to images taken with the Stone Edge Observatory telescope in Sonoma, Calif. To accomplish this, Avery performed a background analysis of 319 digital astronomical images taken over 14 nights. The procedures were designed to reduce or eliminate clutter caused by scattered light, intervening light sources, atmospheric distortion, and missing data due to cloud cover. The steps Avery employed went far beyond those routinely applied to astronomical data. She discussed her results during a presentation at GLAS in late July.
The project wasn’t focused on the main objects being photographed, but on the incidental items also captured in the image.
Analyses of astronomical image backgrounds take into account humidity, wind speed, atmospheric effects and obstructions such as clouds, imperfections in the telescope lens and filter, the angle of observation, moonlight, the angle of the moon relative to the target, and the exposure time, any of which can degrade an image, Avery said. Astronomers call the light produced by the telescope’s target object the image signal. Other specks of light on the image caused by light leaks, dust, or atmospheric conditions are called noise. To be seen, the signal must be brighter than the background. The goal is to eliminate or reduce as much noise as possible while preserving the signal, making the target object easier to detect and study.

But Avery’s project went further than just clearing up background clutter in astronomical photographs. Its premise was to characterize and predict observing conditions and their effects on observation. Avery used a machine learning program to predict the median amount of noise in the background of an image based on the observing conditions.

It’s not an easy task.

Young woman with dark hair parted down the middle, with white earbuds visible, wearing an orange T-shirt, and seated at a table with her eyes cast downward and her hands working on a laptop in front of her. The laptop cover is posted with a jumble of stickers. A smartphone is next to the laptop, and next to the smartphone is a water bottle. In the background are several office chairs and a table, and in the far background are the glass front doors of the GLAS Education office.
GLAS intern Avery Metzcar works on her project in the GLAS Education office during summer 2023.


In a process that seems backwards, Avery focused on the noisy background rather than the target object. She had to find a way to eliminate the signal without affecting the background clutter. That required masking out the object the astronomers were photographing in order to see the background noise: all the stray light astronomers don’t want in their astronomical photographs.
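The article doesn’t say which software Avery used for this masking step, but the idea can be sketched with iterative sigma clipping: pixels far brighter than the running background estimate are treated as source signal and excluded, and the median of what remains is the background level. The threshold and iteration count below are illustrative assumptions, not values from the project.

```python
import numpy as np

def masked_background_median(image, n_sigma=3.0, iterations=5):
    """Estimate an image's median background after masking bright sources.

    Iterative sigma clipping: pixels more than n_sigma standard deviations
    above the current median are flagged as source signal and excluded,
    leaving only the background the analysis sets out to characterize.
    """
    data = np.asarray(image, dtype=float).ravel()
    mask = np.ones(data.size, dtype=bool)  # True = background pixel
    for _ in range(iterations):
        med = np.median(data[mask])
        std = np.std(data[mask])
        new_mask = data < med + n_sigma * std
        if np.array_equal(new_mask, mask):
            break  # mask stopped changing; estimate has converged
        mask = new_mask
    return np.median(data[mask])
```

On a frame with a flat background and a few bright stars, the clipped median recovers the background level even though the stars would skew a naive mean.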
Since the noise in an image is highly affected by the atmosphere, Avery had to correlate weather over the Sonoma observatory with specific images taken at specific times at specific sections of the sky with specific filters. She then used several image processing programs to “stack” or overlay those background images, as many as 20 at a time, taken over several nights to extract the most complete view of the noise in the various sections of the sky.
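The article names no specific stacking tool, but the core of stacking aligned frames is a per-pixel combination; a median combine is a common choice because it rejects transient artifacts (cosmic rays, satellite trails, passing clouds) that appear in only one or two frames. A minimal sketch, assuming the frames are already aligned and the same size:

```python
import numpy as np

def stack_backgrounds(frames):
    """Median-combine a list of aligned, same-sized background frames.

    A per-pixel median discards values that are extreme in only a few
    frames, yielding a more complete view of the steady background in
    that patch of sky than any single exposure provides.
    """
    cube = np.stack(frames, axis=0)   # shape: (n_frames, height, width)
    return np.median(cube, axis=0)
```

With 20 frames, as in Avery’s largest stacks, a pixel would need to be contaminated in roughly half the frames before the artifact survives the median.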
Avery was assisted remotely by Dr. Amanda Pagul, Visiting Scholar at Fermilab National Accelerator Laboratory, and a former GLAS Education intern. Dr. Pagul had done similar work to improve and sharpen images taken by the Hubble Space Telescope.
During Avery’s presentation, Kate Meredith, GLAS director, asked if she followed a particular model in developing her project.
“I did not necessarily have a model to follow,” Avery said. “Amanda (Dr. Pagul) sent me a paper about the way that Hubble images are affected by the sun position and stuff like that,” she said. “It’s a somewhat similar methodology to what they did.”
With a project so complex, it’s understandable that it encountered some difficulties. For example, there were flaws in the data set. “Not all nights and filters had more than one image,” Avery said. “It takes more than one photo to stack images.” And the local meteorological data was incomplete. Avery said she relied on the Western Weather Group website for weather data, and while it was adequate, there was a gap. “It would be great to find more about clouds and various condensation in the air,” Avery said. “Typically, airports have climate information, except the one near Sonoma doesn’t. So you can’t find the resources for it,” she said.
Dr. Doyal “Al” Harper, a University of Chicago astronomer emeritus who sat in on Avery’s presentation, said the Stone Edge Observatory has a cloud sensor, but it’s not calibrated. “(Dr.) Amanda (Pagul) has asked for it to be recalibrated,” Dr. Harper said.
Avery said there was also some anomalous data regarding the effects of moonlight on the image background. Moon angle is the angular separation between the moon and wherever the telescope is pointing. She expected interference to be highest when the moon is closest to where the telescope is pointing, but her data didn’t show that. “Our values showed the opposite, which is weird, and I would have to dig more into that to see why,” Avery said.
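The moon angle described above is a standard great-circle separation between two sky positions. As an illustration (not the project’s actual code), it can be computed from right ascension and declination with the Vincenty formula, which stays numerically accurate for both very small and very large separations:

```python
import numpy as np

def angular_separation_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees between two sky positions given in
    degrees of right ascension and declination, e.g. the Moon and the
    telescope's target, using the Vincenty formula."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    dra = ra2 - ra1
    # Numerator and denominator of tan(separation)
    num = np.hypot(np.cos(dec2) * np.sin(dra),
                   np.cos(dec1) * np.sin(dec2)
                   - np.sin(dec1) * np.cos(dec2) * np.cos(dra))
    den = (np.sin(dec1) * np.sin(dec2)
           + np.cos(dec1) * np.cos(dec2) * np.cos(dra))
    return float(np.degrees(np.arctan2(num, den)))
```

In practice an astronomy library such as astropy’s coordinate tools would handle this, along with computing the Moon’s position at the time of each exposure.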
Dr. Pagul later said she was not concerned about the counterintuitive moon data. “You can explain that away with other conditions causing that,” Dr. Pagul said. “If you observe an object near the Moon in R(red)-band, it’s going to matter less than if you observe it in a bluer band because the Moon transmits intensely through blue and not as much through red.”
Dr. Pagul said she was impressed with Avery’s progress on the project.
“She did a lot,” Dr. Pagul said of Avery. Her work measured how the background varied based on the position of the moon, the ambient temperature, the telescope mirror temperature, the filter used, relative humidity, and angle of observation. “If you’re observing at 30 degrees you expect to have more background,” Dr. Pagul said. “So she plotted what the average or the median background was for each image after removing the sources.”
Avery then turned to a machine learning program to determine how observing conditions affect the image background. “If you put in: this is the cloudiness, this is the phase of the Moon, this is the altitude you’re observing at; if you plug in all those values, will you know what your background is? Which is important, because if you want to observe something faint and extended, you probably want to optimize those parameters,” Dr. Pagul said.
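The article doesn’t specify which algorithm Avery’s program used, so as a stand-in, the prediction step Dr. Pagul describes can be sketched as a simple regression: fit a model mapping observing conditions (moon angle, cloudiness, humidity, altitude, and so on) to the measured median background, then query it for planned conditions. An ordinary least-squares linear model is the minimal version of that idea:

```python
import numpy as np

def fit_background_model(conditions, backgrounds):
    """Fit a linear model mapping observing-condition vectors (e.g. moon
    angle, cloudiness, humidity, observing altitude) to measured median
    backgrounds. A stand-in for the intern's ML program, whose actual
    algorithm isn't described in the article."""
    X = np.column_stack([np.ones(len(conditions)),
                         np.asarray(conditions, dtype=float)])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(backgrounds, dtype=float),
                                 rcond=None)
    return coeffs

def predict_background(coeffs, condition):
    """Predict the median background for a new set of conditions."""
    return float(coeffs[0] + np.asarray(condition, dtype=float) @ coeffs[1:])
```

Given such a model, an observer could plug in a forecast night’s conditions and decide whether a faint, extended target is worth attempting, which is exactly the optimization Dr. Pagul describes.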
The work isn’t done, Dr. Pagul said. But she said she hopes that Avery can come back and complete it.
