Lab 1.1: Source Extraction and Differential Photometry#
Due: Wednesday, April 23 by 5:00 pm.
In this lab, we will create a catalogue of stars using our coadded images of M35 and M3. By the end of this lab, you will be able to
- use sep to estimate background and extract sources
- perform aperture photometry, and describe under what conditions it is reliable
- correct zero-point offsets using stars of known magnitude
- create and interpret a color-magnitude diagram
You should submit the results of the first section for both M35 and M3 on both telescopes. Probably the simplest way is to complete the code for one object and telescope and re-run it for the others. You will likely have to play with different settings for the second object. We recommend starting with M35 on the 0.7m.
# If using Colab, mount your Google Drive to access data in the shared folder
from google.colab import drive
drive.mount('/content/drive')
# If using Colab, you also need to install the reproject, sep, and astroquery packages
%pip install reproject
%pip install sep
%pip install astroquery
# Import packages!
# numpy is a core package for numerical computing in Python, mostly it does fast array operations
import numpy as np
# matplotlib is a common plotting library (plotly is another good one)
import matplotlib.pyplot as plt
from matplotlib.patches import Ellipse
%matplotlib inline
# astropy is useful for reading FITS files, doing coordinate transformations, converting units, and much more
import astropy
from astropy.io import fits
from astropy.wcs import WCS # World Coordinate System
from astropy.coordinates import SkyCoord
import astropy.units as u
from astroquery.simbad import Simbad
# reproject allows us to reproject images onto the same coordinates before stacking
from reproject import reproject_interp
import sep
# glob and os are useful for navigating your file system
import glob
import os
# miscellaneous
from tqdm.notebook import tqdm
import warnings
from collections import defaultdict
# TODO: use code from previous labs to calibrate and coadd all your images
# You might find it useful to make a script which does all the steps for convenience
# You should have 8 images
coadd_filenames = {
    '0.7m': {
        'm35': {
            'B': ,
            'V': ,
        },
        'm3': {
            'B': ,
            'V': ,
        },
    },
    '24in': {
        'm35': {
            'B': ,
            'V': ,
        },
        'm3': {
            'B': ,
            'V': ,
        },
    },
}
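Once the dictionary is filled in, a quick check like the sketch below (using only the os module imported above) can catch path typos before you run the rest of the notebook; it simply walks the nested dictionary and complains about any missing file.
# Sanity check: every telescope/object/filter combination should point to an existing file
for telescope, objects in coadd_filenames.items():
    for obj, filters in objects.items():
        for filt, filename in filters.items():
            if not os.path.exists(filename):
                print(f'Missing coadd for {telescope} {obj} {filt}: {filename}')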
Source Extraction#
We will be using the package sep (the Python version of SExtractor). Documentation for this package can be found at https://sep.readthedocs.io. Most of the steps below are covered in the sep tutorial, so check that out for more explanation. Note that unlike the tutorial, we only use the sep background to detect point sources, not to measure fluxes; to measure each star's flux we will estimate its local background from an annulus around that star.
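As a reference for the cells below, here is a minimal, self-contained sketch of the sep calls on a fake image; the sizes and thresholds are placeholders and are not tuned for our data, so treat it as an illustration of the workflow rather than the solution.
# Toy demonstration of the sep workflow on a synthetic image (not our data)
demo = np.random.normal(100.0, 5.0, size=(256, 256)).astype(np.float32)
demo[100:104, 120:124] += 500.0                          # inject one fake "star"
demo_bkg = sep.Background(demo, bw=64, bh=64)            # smooth 2D background model
demo_sub = demo - demo_bkg.back()                        # background-subtracted image
demo_sources = sep.extract(demo_sub, thresh=5.0, err=demo_bkg.globalrms)
print(f'Found {len(demo_sources)} source(s) in the toy image')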
# Load data
data = {}
headers = {}
for filter, filename in coadd_filenames['0.7m']['m3'].items():
    with fits.open(filename) as hdul:
        # sep needs data in native byte order, so convert the (big-endian) FITS array
        data[filter] = hdul[0].data.astype(hdul[0].data.dtype.newbyteorder('='))
        headers[filter] = hdul[0].header
# Subtract background
source_data = # TODO: choose which is the best filter to use for source extraction
mask = # TODO: mask the nan values
background = # TODO: use sep to estimate the background (hint: don't forget to use mask, try varying `bw` and `bh` for M3)
subtracted_data = # TODO: subtract the background for purposes of source detection
# Extract sources
sigma_thresh = # TODO: find a good confidence threshold
sources = # TODO: use sep to extract sources (read documentation, don't forget to set err and mask, also try varying deblend_cont for M3 in particular)
# Exclude sources which are not stars by setting a maximum area, you probably only need to do this for M3
max_area = # TODO: find a good maximum area threshold
sources = sources[sources['a'] * sources['b'] * np.pi < max_area] # keep only sources smaller than the area threshold
print(f'Detected {len(sources)} sources')
# Plot extracted sources on top of image
fig, ax = plt.subplots(dpi=100)
ax.imshow(subtracted_data, vmin=0, vmax=10000, cmap='gray_r')
for s in sources:
    e = Ellipse((s['x'], s['y']), 8*s['a'], 8*s['b'], angle=np.rad2deg(s['theta']), edgecolor='red', facecolor='none', linewidth=0.2)
    ax.add_patch(e)
plt.axis('off')
plt.show()
Question: Does sep do a good job extracting sources for M35? For M3? Do you notice any issues?
Answer:
# Find the flux (in counts/second) within an aperture around each star
fluxes = {}
magnitudes = {}
r1 = # TODO: choose appropriate radius for the aperture
r2 = # TODO: choose appropriate radius for the background annulus
for filter in ['B', 'V']:
    counts = # TODO: sum the pixels corresponding to each star (hint: check out ``sep.sum_circle``, and don't forget about the mask argument!)
    inner_area = # TODO: calculate the inner area
    background_counts = # TODO: sum the pixels corresponding to each star's background (hint: check out ``sep.sum_circann``)
    outer_area = # TODO: calculate the outer area
    background_counts_per_pixel = # TODO: calculate the background counts per pixel in the annulus
    fluxes[filter] = # TODO: calculate background-subtracted fluxes in units of counts per second
    magnitudes[filter] = # TODO: convert fluxes to instrumental magnitudes (don't worry about the zero-point here)
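To keep the bookkeeping in the cell above straight, here is the arithmetic with made-up numbers: the annulus gives the sky level per pixel, which is scaled to the aperture area and subtracted before converting to an instrumental magnitude. The values below are placeholders, not measurements.
# Toy numbers illustrating the aperture/annulus arithmetic (not real measurements)
aperture_counts = 12000.0                       # total counts inside r1
aperture_area = np.pi * 5.0**2                  # aperture area in pixels (r1 = 5 here)
annulus_counts = 9000.0                         # total counts between r1 and r2
annulus_area = np.pi * (9.0**2 - 5.0**2)        # annulus area in pixels (r2 = 9 here)
sky_per_pixel = annulus_counts / annulus_area   # background level per pixel
star_counts = aperture_counts - sky_per_pixel * aperture_area
instrumental_mag = -2.5 * np.log10(star_counts) # divide by the exposure time first if your coadd is not already in counts per second
print(star_counts, instrumental_mag)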
Question: Should the aperture radius for each star be the same or different? Why? What is causing the stars to have nonzero size?
Answer:
Question: What is causing the background flux? Can you think of any cases when the aperture method above might fail? Are there any checks you might want to do to your data as a result?
Answer:
Differential Photometry#
# Find the sky coordinates in RA, Dec for each source
wcs = WCS(headers['V']) # TODO: make a WCS object using the right header
ra, dec = # TODO: use all_pix2world to convert pixel coordinates to RA and DEC
coords = SkyCoord(ra=ra * u.deg, dec=dec * u.deg)
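If you have not used all_pix2world before, the toy example below shows the call pattern with a hand-built WCS; the header values are made up, and the final argument 0 says that our pixel coordinates (like the ones from sep) are zero-based.
# Toy pixel -> sky conversion with a hand-built WCS (placeholder header values)
toy_wcs = WCS(naxis=2)
toy_wcs.wcs.ctype = ['RA---TAN', 'DEC--TAN']
toy_wcs.wcs.crval = [205.5, 28.4]       # reference RA, Dec in degrees (made up)
toy_wcs.wcs.crpix = [512.0, 512.0]      # reference pixel (made up)
toy_wcs.wcs.cdelt = [-2e-4, 2e-4]       # degrees per pixel (made up)
toy_ra, toy_dec = toy_wcs.all_pix2world([100.0, 900.0], [200.0, 600.0], 0)
print(toy_ra, toy_dec)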
# Choose the brightest sources to calibrate against the catalog
n_brightest = 20
brightest_sources = # TODO: get the indices of the brightest stars in V band
# Query the field for stars from the SIMBAD catalog
Simbad.add_votable_fields('B', 'V')
center = # TODO: choose the center `SkyCoord` for your field
radius = # TODO: set field radius
table = Simbad.query_region(center, radius=radius)
table = table[table['V'] < ] # TODO: restrict to stars brighter than some magnitude
catalog_coords = SkyCoord(ra=table['ra'], dec=table['dec'])
# Match sources to the catalog
idx, d2d, d3d = coords[brightest_sources].match_to_catalog_sky(catalog_coords)
close_enough = d2d < # TODO: get only matched stars which are within a small angular distance of each other
print(f'{np.sum(close_enough)}/{n_brightest} matched within threshold')
# Check if the matched stars look correct, if not change the `close_enough` threshold
plt.scatter(coords[brightest_sources][close_enough].ra, coords[brightest_sources][close_enough].dec, s=16, label='sources')
plt.scatter(catalog_coords.ra, catalog_coords.dec, s=4, label='catalog')
plt.legend()
plt.show()
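Before filling in the next cell, recall what a zero-point offset is: for each matched star it is the catalog magnitude minus our instrumental magnitude, and adding it back to an instrumental magnitude calibrates it. A toy single-star example with made-up values:
# Toy zero-point example (made-up values, one star)
toy_catalog_mag = 12.3          # catalog V magnitude
toy_instrumental_mag = -8.1     # our instrumental magnitude, -2.5*log10(flux)
toy_zero_point = toy_catalog_mag - toy_instrumental_mag
print(toy_zero_point, toy_instrumental_mag + toy_zero_point)  # second value recovers the catalog magnitude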
# TODO: calculate the zero-point offsets for the comparison stars
zero_point =
# Only use stars which are a good match
for k in zero_point.keys():
    zero_point[k] = zero_point[k][close_enough]
# Check that there are no obvious outliers in the offsets
plt.hist(zero_point['B'], histtype='step', bins=10, label='B')
plt.hist(zero_point['V'], histtype='step', bins=10, label='V')
plt.gca().set(xlabel='Calculated zero-point', ylabel='Number of stars')
plt.legend()
plt.show()
Question: What is this histogram telling you? Based on the distribution, can we use the average to correct our instrumental magnitudes?
Answer:
# TODO: Calibrate the magnitudes using the mean zero-point offset of the reference stars
calibrated_magnitudes =
# TODO: make a color-magnitude diagram, with the appropriate axes (hint: look up color-magnitude diagram)
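If you are unsure about the plotting conventions, the sketch below uses random made-up magnitudes just to show the usual axes of a color-magnitude diagram: color (B − V) on the x-axis and V magnitude on the y-axis, with the y-axis inverted so brighter stars sit at the top. It is not a substitute for plotting your calibrated magnitudes.
# Toy CMD showing the axis conventions only (random values, not our data)
toy_V = np.random.uniform(10, 18, 200)
toy_BV = np.random.uniform(-0.2, 1.5, 200)   # fake B - V colors
plt.scatter(toy_BV, toy_V, s=4)
plt.gca().set(xlabel='B - V (color)', ylabel='V (magnitude)')
plt.gca().invert_yaxis()                     # brighter (smaller magnitude) at the top
plt.show()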
Question: What physical properties of stars do color and magnitude correspond to? Do the diagrams match your expectations for M35? For M3? Do the diagrams from each telescope look the same? (Hint: Monday’s lecture will be helpful for answering this question.)
Answer:
Acknowledgements#
TODO: Please write a short team contribution statement together explaining what each member did for the lab!