A Python package for National Park Service Natural Sounds and Night Skies Division (NSNSD) staff to process and analyze all-sky imagery acquired with the CCD Camera System (Dan Duriscoe et al. 2007) to assess light pollution impacts on parks' night sky resources.
- Required Software and Install Procedures
- Preparing Data For Processing
- Running the Pipeline
- Processing Flow Chart
- Image Processing Module Documentation
- Natural Sky Modeling Module Documentation
- Sky Brightness Metrics Module Documentation
- ArcGIS Pro 3.5 or later
- Adobe Photoshop
- conda (recommended to use Miniforge for new installs)
- Python packages:
`arcpy`, `astropy`, `matplotlib`, `scikit-image`, `photutils`, `numpy`, `pandas`, `scipy`, `pillow`, `dbfread`
ArcGIS Pro comes with a pre-built conda environment called arcgispro-py3 that is set up to use arcpy. The easiest way to get arcpy functionality into a new conda environment is to clone the arcgispro-py3 environment and then install additional packages as needed. The following steps provide setup instructions for a Windows machine:
- Add the path to the arcgispro-py3 environment to your conda configuration file:

```
conda config --add envs_dirs "C:\Program Files\ArcGIS\Pro\bin\Python\envs"
```
- Clone arcgispro-py3 to create a new environment named ccd:

```
conda create --clone arcgispro-py3 --name ccd
```
- Copy the pinned file from the arcgispro-py3 environment to the ccd environment:

```
cp "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\conda-meta\pinned" "C:\Users\<YourUsername>\AppData\Local\miniforge3\envs\ccd\conda-meta"
```
- Install other necessary packages into the ccd environment:

```
conda activate ccd
conda install astropy scikit-image photutils
pip install dbfread
```
Below are a few common issues that have been encountered during installation or during initial attempts at executing the processing pipeline:
- SSL Certificate Errors during package installation or when downloading online content:
  - Try installing the `pip-system-certs` package, using the `--cert` argument if needed to pass in a certificate file:

```
pip install pip-system-certs
pip install pip-system-certs --cert="path/to/CertFile.cer"
```
- lxml DLL error when running arcpy commands:
  - Uninstall and re-install the `lxml` package, making note of which version (e.g. 5.1.0) of lxml is being used:

```
pip uninstall lxml
pip install lxml==5.1.0
```
The directory structure assumed by the pipeline looks as follows, with a description of the primary contents of each folder to the right:
```
CCD
├─── Data                        -Contains the filelist.xlsx processing list
│    ├─── calibdata              -Calibrated images and data validation outputs
│    ├─── fielddata              -Raw images and calibration files
│    ├─── graphics               -Final all-sky skyglow images per data set
│    ├─── griddata               -ArcGIS grids and mosaics per data set
│    ├─── maps                   -ArcGIS map templates for producing graphics
│    ├─── rasters                -Master ArcGIS grids for natural sky modeling
│    │    ├─── scratch_fullres   -ArcGIS workspace for full-resolution mosaics
│    │    ├─── scratch_galactic  -ArcGIS workspace for Galactic model mosaics
│    │    ├─── scratch_median    -ArcGIS workspace for median-filtered mosaics
│    │    ├─── scratch_metrics   -ArcGIS workspace for calculating skyglow metrics
│    │    ├─── scratch_natsky    -ArcGIS workspace for natural sky mosaics
│    │    └─── scratch_zodiacal  -ArcGIS workspace for Zodiacal model mosaics
│    ├─── spreadsheets           -Spreadsheet templates and some star catalogs
│    ├─── standards              -Standard star catalogs for extinction/zeropoint fitting
│    └─── tables                 -Summary tables with metadata and light pollution metrics
│
├─── Images
│    ├─── Linearity Curves       -Master linearity curves per CCD camera
│    └─── Master                 -Master Flat/Bias/Thermal images per CCD camera
│
└─── Scripts                     -Houses the pipeline scripts contained in this repository
```
Raw data that will be processed by the pipeline lives in the CCD --> Data --> fielddata directory, where each night of data should be separated into individual sub-folders named using the 4-letter park code and UTC date of data collection (e.g. ROMO241004 for data collected from Rocky Mountain NP on 2024 October 4th). Within a night's data folder will be additional sub-folders, one per data set collected:
```
CCD
└─── Data
     └─── fielddata
          └─── ROMO241004
               ├─── 1st
               ├─── 2nd
               ├─── 3rd
               └─── ...
```
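The folder-naming convention above (4-letter park code plus UTC date in YYMMDD form) can be sketched with a small Python helper. This is a hypothetical illustration, not part of the pipeline:

```python
from datetime import date

def night_folder(park_code: str, utc_date: date) -> str:
    """Build a data-night folder name: 4-letter park code + UTC date (YYMMDD)."""
    return f"{park_code}{utc_date:%y%m%d}"

# A night collected at Rocky Mountain NP on 2024 October 4:
night_folder("ROMO", date(2024, 10, 4))  # -> "ROMO241004"
```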
- Set up the CCD directory tree as shown above. You can download the zipped CCD Folder, which contains the proper directory structure but is empty of the data, calibration, and ArcGIS grid files you will need to run the pipeline. If you wish to run the pipeline, please contact the Natural Sounds and Night Skies Division to inquire about obtaining the necessary calibration and grid files.
- Within the `CCD` directory, clone this repository into a folder named `Scripts`:

```
cd "\path\to\CCD"
git clone https://github.com/zvanderbosch/nightskies_fork.git Scripts
```
- Ensure raw data is placed in the `fielddata` directory.
- In the `filepath.py` script, make sure the `base` parameter points to the location of the `CCD` directory on your local machine.
- In the `filepath.py` script, update the `apikey` parameter with your own Astrometry.net API key. This will be needed for image plate solving.
- Modify the `filelist.xlsx` file, which should be located in the `CCD --> Data` directory. This file tells the pipeline which data sets are going to be processed. An example `filelist.xlsx` file is provided here, and has the following fields:
  - `Dataset`: Name of data night to process (e.g. ROMO241004)
  - `Process`: Yes or No, whether to process this dataset
  - `V_band`: Yes or No, whether to process V-band images
  - `B_band`: Yes or No, whether to process B-band images
  - `Flat_V`: Name of master flat file used to calibrate V-band images
  - `Flat_B`: Name of master flat file used to calibrate B-band images
  - `Curve`: Name of linearity response curve file used to calibrate images
  - `Zeropoint`: The default zeropoint (mag) for the CCD camera used
  - `Processor`: Name of data processor with first initial and last name (e.g. J Doe)
  - `Central_AZ`: Azimuth coordinate to place at the center of final panoramic graphics
  - `Location`: Descriptive park name (e.g. Rocky Mountain NP)
Pipeline operation is broken down into four main steps:

1. `process_images.py`: Calibrate raw images, solve image coordinates, stitch images together to create all-sky mosaics, and generate corresponding Galactic and Zodiacal model mosaics.
2. Adobe Photoshop: Create the terrain mask.
3. `naturalsky.py`: Generate the combined natural sky model and subtract it from the observed all-sky mosaics to create an anthropogenic skyglow mosaic.
4. `process_metrics.py`: Calculate the complete set of sky brightness metrics and generate final output images.
Assuming you are working from a command-line interface, such as Windows Terminal or PowerShell, and have completed the conda environment setup and data preparation steps above, an example pipeline processing workflow for a dataset named ROMO241004 would look like the following:
```
conda activate ccd
cd "\path\to\CCD\Scripts"
python process_images.py
```

- Use Excel to open the `calibreport.xlsx` file saved in `CCD --> Data --> calibdata --> ROMO241004`. Manually edit the DATA QUALITY, BORTLE CLASS, ZLM, and NARRATIVE fields with appropriate values based on visual observations from the data collection site, and save changes to the file.
- Use Adobe Photoshop to edit the `mask.tif` file saved in `CCD --> Data --> griddata --> ROMO241004 --> mask`, refining the horizon boundary and setting all above-horizon pixels as white and all below-horizon pixels as black.

```
python naturalsky.py ROMO241004 1 V --airglowzenith=45
```

- The `naturalsky.py` script will generate an Excel file `natsky_model_params.xlsx` located in the `CCD --> Data --> calibdata --> ROMO241004` folder. Edit the Quality Flag and Notes columns for each data set, indicating the quality of the natural sky model subtraction on a scale of 0 (bad) to 5 (excellent) and using the Notes column to describe any problems encountered.

```
python process_metrics.py
```

- The `process_metrics.py` script will save a summary of the data processing results into an Excel file named `ROMO241004.xlsx`, located in the `CCD --> Data --> tables` folder. A few data fields, listed below, require manual entry and/or checking to make sure values are correct:

```
NIGHT METADATA Sheet:
├─── CAMERA            -Camera type
├─── LENS              -Lens type
├─── FILTER            -Filter ID code
├─── INSTRUMENT        -Instrument name
├─── ZLM               -Zenith limiting magnitude (from calibreport.xlsx)
├─── BORTLE            -Bortle class (from calibreport.xlsx)
├─── SQM               -SQM measured value
├─── OBS_1 - OBS_4     -Observer names
└─── NARRATIVE         -Visual observations narrative (from calibreport.xlsx)

SET METADATA Sheet:
├─── GLARE             -Glare quality rating (0-5)
├─── ATMOSPHERE        -Atmospheric quality rating (0-5)
├─── COLLECTION        -Data collection quality rating (0-5)
├─── PROCESSING        -Data processing quality rating (0-5)
├─── REFERENCE         -Reference data set for the night? (Y or N)
├─── USEABLE           -Is data set useable? (Y or N)
├─── CLOUDS            -Cloud cover (0-100%)
├─── PLUMES            -Volcanic plume coverage (0-100%)
├─── PCT20             -20-micron particle count (ppm)
└─── COLLECTION_NOTES  -Data collection notes

CALIBRATION Sheet:
├─── BAD_FRAMES        -Number of bad images
└─── CALIB_NOTES       -Calibration quality notes
```
The process_images.py and naturalsky.py scripts both have command line arguments available, summarized below:
Optional

```
--reduce-only              # Only execute the image reduction step (reduce)
--register-only            # Only execute the image plate solving step (register)
--skip-reduce              # Skip the reduce step and execute all other steps
--coord-steps              # Only execute steps that affect image coordinates (register, pointing, coordinates)
--mosaics-only             # Only execute mosaic generation steps (fullmosaic, medianmosaic, galactic, zodiacal)
--use-night-cals           # Use dark/bias frames from night data collection for image reduction
--use-existing-astrometry  # Use existing astrometric solutions if available
--use-domain-clip          # Use full image extent rather than rectangle clipping for mosaics
```

Required
```
DATANIGHT        # Name of the night to process, e.g. ROMO241004
DATASET          # Data set number to process, e.g. 1, 2, 3, etc.
FILTER           # Filter to process, V or B (though only V currently works)
```

Optional

```
--elevation      # Site elevation [m], Default = FITS header value
--extcoeff       # Extinction coefficient [mag/airmass], Default = Best-fit value
--airglowzenith  # Zenith airglow brightness [nL], Default = 20
--airglowheight  # Height of emitting airglow layer [km], Default = 90
--airglowext     # Airglow extinction factor, Default = 0.6
--adlfactor      # Atmospheric Diffuse Light factor, Default = 1.2
--galext         # Galactic light extinction factor, Default = 0.9
--zodext         # Zodiacal light extinction factor, Default = 0.6
```

- 1. Reduction
- 2. Registration
- 3. Pointing Error
- 4. Zeropoint and Extinction
- 5. Median Filter
- 6. Galactic and Zodiacal Coordinates
- 7. Galactic Mosaic
- 8. Zodiacal Mosaic
- 9. Full-resolution Mosaic
- 10. Median-filtered Mosaic
- Public domain
This script performs basic image reduction, including corrections for bias, dark, flat, and linearity response of the detector.
process_images.py > reduce_images() > reduce.py > reducev() and reduceb()
At the beginning of data collection, 5 dark and 5 bias calibration images are captured for each data set. The script corrects each science image for bias, dark current, flat-field response, and detector linearity, using either the master calibration files or (with the `--use-night-cals` option) the dark and bias frames collected in the field.
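The per-pixel arithmetic of this kind of CCD reduction can be sketched with numpy. The function below is a simplified, hypothetical illustration (not the actual `reducev()`/`reduceb()` implementation) that omits the linearity-curve correction:

```python
import numpy as np

def reduce_frame(raw, master_bias, master_dark, master_flat):
    """Simplified CCD reduction: subtract bias and dark, divide by flat.

    The real pipeline also applies the camera's linearity-response
    curve; that step is omitted here for clarity.
    """
    corrected = raw.astype(float) - master_bias - master_dark
    return corrected / master_flat

# Tiny synthetic example with uniform calibration frames
raw = np.full((4, 4), 1100.0)   # raw science pixel values [DN]
bias = np.full((4, 4), 100.0)   # master bias [DN]
dark = np.full((4, 4), 50.0)    # master dark [DN]
flat = np.full((4, 4), 0.95)    # normalized master flat
reduced = reduce_frame(raw, bias, dark, flat)
```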
This script registers the pointing of the images to sky coordinates by matching the positions of standard stars captured in each image to their positions in the Tycho2 catalog. The matching is performed with the PinPoint.Plate object (through ACP Observatory software).
process_images.py > register_coord() > register.py > matchstars()
The script first reads in the reduced images. If an image contains the horizon, the script masks the part of the image below ~2 degrees in elevation. Then the script uses the PinPoint.Plate object (through ACP Observatory software) to compare the positions of the standard stars in the images to the Tycho2 catalog. If an image cannot be solved, the script tries solving it using only the central 200 x 200 pixels. If the image still cannot be solved, it is skipped. The script returns a list of files solved with cropped images and a list of files that failed to solve. If an image is solved successfully, its header is updated with the solved coordinate information.
This script calculates the pointed azimuth and altitude using the solved RA and Dec values from the image headers. If the images are not solved, the RA, Dec, AZ, and ALT values are interpolated. The output from this script will be used later for making the mosaic.
process_images.py > pointing_error() > pointing.py > pointing_err()
The script reads in the solved RA and Dec values from the image headers, updates the coordinates to the observed date, and translates them to azimuth and altitude given the local apparent sidereal time (LAST) and the site longitude. For images that were not solved, the RA, Dec, AZ, and ALT values are interpolated. The script then updates the image headers and records these values in a text file.
This script finds the best-fit extinction coefficient and the instrumental zeropoint. These values are needed for the photometric calibration later.
process_images.py > fit_zeropoint() > extinction.py > extinction()
This script finds the best-fit extinction coefficient and the instrumental zeropoint by identifying the standard stars in the images using the PinPoint.Plate object, measuring their background-subtracted flux in [DN/s] through aperture photometry (source aperture radius = 3 pixels; ring-shaped background annulus extending from 4 to 8 pixels), computing their airmass from their elevation, and comparing the measured fluxes to the stars' absolute magnitudes. It then fits for the extinction given the airmass and M-m of all the standard stars. The best fit is found with numpy.polyfit(), with the uncertainties of the best-fit parameters estimated from the covariance matrix. This script outputs a list of the standard stars used for fitting, a graphical display of the fitting result, and the best-fit extinction coefficient and zeropoint.
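The fitting step can be illustrated with `numpy.polyfit` on synthetic data. The values below are made up for demonstration (an assumed zeropoint of 19.0 mag and extinction of 0.2 mag/airmass); the real pipeline fits measured fluxes of standard stars:

```python
import numpy as np

# Synthetic standard-star data: catalog magnitude minus instrumental
# magnitude (M - m) as a function of airmass, generated from an
# assumed zeropoint of 19.0 mag and extinction of 0.2 mag/airmass.
airmass = np.array([1.0, 1.2, 1.5, 2.0, 2.5])
M_minus_m = 19.0 - 0.2 * airmass

# Linear fit: M - m = zeropoint - k * airmass
(slope, zeropoint), cov = np.polyfit(airmass, M_minus_m, 1, cov=True)
k = -slope                      # extinction coefficient [mag/airmass]
sigma = np.sqrt(np.diag(cov))   # parameter uncertainties from the covariance matrix
```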
This script applies a median filter to each image to remove the stars. The filtered images show the sky background brightness.
process_images.py > apply_filter() > medianfilter.py > filter()
We first read in the plate scale from the header of a solved image. Then we set the filter mask radius to ~0.5 degree, which translates to a radius of 18 pixels for the ML3 and ML4 CCD cameras. This filter size was selected to ensure that most (or all) point sources are effectively filtered out and that the median value in the filtered window is a good representation of the background brightness. The script uses multiprocessing to speed up filtering of all the images. Here, MaxIm DL is used to convert the filtered FITS images to TIFF images so that they are compatible with ArcGIS for the later mosaic steps.
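The effect of the filter can be demonstrated with `scipy.ndimage.median_filter` on a synthetic image. Note this is only an illustration: the pipeline uses a circular mask of ~18-pixel radius, while the sketch below uses a square window of comparable size for simplicity:

```python
import numpy as np
from scipy.ndimage import median_filter

# Synthetic image: flat 500 DN background with one bright "star"
sky = np.full((100, 100), 500.0)
sky[50, 50] = 50000.0

# A 37x37 square window approximates the 18-pixel filter radius.
# The median of the window ignores the lone bright pixel, so the
# star is removed while the background level is preserved.
filtered = median_filter(sky, size=37)
```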
This script calculates the ecliptic and galactic coordinates and rotation angles of the images. These output coordinates will be used to produce the natural sky model for the location, date, and time specific to the data set.
process_images.py > compute_coord() > coordinates.py > galactice_ecliptic_coords()
The script reads in the longitude and latitude from the image header, along with the registered image coordinates in azimuth and altitude from the input file. It then uses ACP to calculate the corresponding galactic and zodiacal coordinates. The output coordinates are saved in a text file for later use in making the Galactic and Zodiacal model mosaics.
This script makes the whole sky mosaic from the full resolution images according to the location of observed sky. This script also generates ArcGIS shapefiles used to define the spatial extent of each image. The Median, Galactic, and Zodiacal mosaic steps use these shapefiles as inputs to perform image clipping and match exactly the per-image extent of the full-resolution mosaic.
process_images.py > mosaic_full() > fullmosaic.py > mosaic()
This script reads in the full-resolution tiff files and the pointing error file. It also uses some premade raster templates from the raster folder. We use arcpy (ArcGIS) to manipulate the images with projection, removal of distortion, clipping, and mosaicking. Note that the mosaic raster list must start with an image whose maximum pixel value is greater than 256 to avoid "no data" in the final mosaic. We then read in the zeropoint from the input file and use it to convert the mosaic from the raw unit in data number (DN) to the calibrated unit of magnitudes per square arcsecond using the following equation:

$$\mu = Z - 2.5\log_{10}\left(\frac{\mathrm{DN}}{t \cdot s^{2}}\right)$$

where $\mu$ is the calibrated surface brightness [mag/arcsec$^2$], $Z$ is the instrumental zeropoint [mag], $t$ is the exposure time [s], and $s$ is the plate scale [arcsec/pixel].
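The DN-to-surface-brightness conversion can be written as a small helper, a sketch consistent with the standard relation $\mu = Z - 2.5\log_{10}(\mathrm{DN}/(t\,s^2))$; the pipeline performs the equivalent operation with arcpy raster algebra:

```python
import math

def dn_to_mag_arcsec2(dn, zeropoint, exptime_s, plate_scale):
    """Convert a pixel value [DN] to surface brightness [mag/arcsec^2].

    plate_scale is in arcsec/pixel, so plate_scale**2 is the pixel
    area in arcsec^2; dividing by it yields flux per square arcsecond.
    """
    flux_per_arcsec2 = dn / (exptime_s * plate_scale**2)
    return zeropoint - 2.5 * math.log10(flux_per_arcsec2)
```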
This script makes the whole sky mosaic from the median filtered images according to the location of observed sky.
process_images.py > mosaic_median() > medianmosaic.py > mosaic()
Same as the method described in the "Full-resolution Mosaic" section above, with the exception of using the median-filtered images as input. This module also creates the initial mask.tif file, used for generating the terrain mask.
This script makes the whole sky mosaic of the galactic model according to the time and location of the observed sky.
process_images.py > mosaic_galactic() > galactic.py > mosaic()
This script reads in some premade raster templates from the raster folder. Then it reads in the galactic coordinates and the pointing error from the input files. We use arcpy (ArcGIS) to manipulate the images with rotation, projection, clipping, and mosaic. The output raster and layer files are stored in the Griddata folder.
This script makes the whole sky mosaic of the zodiacal model according to the time and location of the observed sky.
process_images.py > mosaic_zodiacal() > zodiacal.py > mosaic()
This script reads in some premade raster templates from the raster folder. Then it reads in the zodiacal coordinates and the pointing error from the input files. We use arcpy (ArcGIS) to manipulate the images with rotation, projection, clipping, and mosaic. The output raster and layer files are stored in the Griddata folder.
The naturalsky.py script is somewhat unique compared to the process_images.py and process_metrics.py routines in that all of its sub-routines are internal to the naturalsky.py script as class objects rather than broken out into individual Python scripts.
- 1. Finalize Terrain Mask
- 2. Airglow Model
- 3. A.D.L. Model
- 4. Galactic Model
- 5. Zodiacal Light Model
- 6. Observed Sky Brightness
- 7. Combined Natural Sky Model
- 8. Anthropogenic Light Mosaic
- 9. Mosaic Analysis
This class finalizes the terrain mask so it can be used for clipping/masking procedures in later steps.
naturalsky.py > Mask()
This class reads in the mask.tif file after it has been edited with Adobe Photoshop. We use arcpy (ArcGIS) to project the mask into the fisheye equal area coordinate system and set values below the horizon to NoData and values above the horizon to zero. The final mask raster, maskd.tif, is stored in the Griddata folder.
To generate the airglow component of the natural sky model.
naturalsky.py > Airglow()
This class uses arcpy (ArcGIS) to generate the airglow brightness model according to the van Rhijn equation (Leinert et al. 1998) and then applies atmospheric extinction. The extincted model is saved to the Griddata directory. This module also saves all natural sky model input parameters to the Excel sheet natsky_model_params.xlsx in the Calibdata directory. The van Rhijn equation is written as:
$$\frac{I(z)}{I_{\mathrm{zen}}} = \left[1 - \left(\frac{R}{R+H}\right)^{2}\sin^{2}z\right]^{-1/2}$$

where $z$ is the zenith angle, $R$ is the radius of the Earth, $H$ is the height of the emitting airglow layer, and $I_{\mathrm{zen}}$ is the airglow brightness at the zenith.
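The van Rhijn geometric enhancement is straightforward to compute. A minimal sketch (hypothetical helper, using the pipeline's default 90 km layer height; extinction is not included):

```python
import math

def van_rhijn(zenith_angle_deg, layer_height_km=90.0, earth_radius_km=6378.0):
    """van Rhijn enhancement of airglow brightness relative to the zenith.

    Equals 1 at the zenith and grows toward the horizon as the line of
    sight traverses a longer path through the emitting layer.
    """
    ratio = earth_radius_km / (earth_radius_km + layer_height_km)
    sin_z = math.sin(math.radians(zenith_angle_deg))
    return 1.0 / math.sqrt(1.0 - (ratio * sin_z) ** 2)

# e.g. scale a 20 nL zenith airglow to a given zenith angle:
# brightness = 20.0 * van_rhijn(70.0)
```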
To generate the atmospheric diffuse light component of the natural sky model.
naturalsky.py > ADL()
This class uses arcpy (ArcGIS) to load in a pre-generated all-sky model of atmospheric diffuse light for Mauna Kea, Hawaii (see Kwon et al. 2004 and Duriscoe et al. 2013) and applies a linear scaling factor.
To generate the galactic component of the natural sky model.
naturalsky.py > Galactic()
This class uses arcpy (ArcGIS) to load in the Galactic mosaic generated in a previous step and apply atmospheric extinction scaled by a linear constant. The extincted model is saved to the Griddata directory.
To generate the zodiacal light component of the natural sky model.
naturalsky.py > Zodiacal()
This class uses arcpy (ArcGIS) to load in the Zodiacal light mosaic generated in a previous step and apply atmospheric extinction scaled by a linear constant. The extincted model is saved to the Griddata directory.
To load in the observed sky brightness mosaic and prepare it for subtraction of the natural sky model.
naturalsky.py > Skybright()
This class uses arcpy (ArcGIS) to load in the observed median-filtered sky brightness mosaic, convert pixel values from mag/arcsec$^2$ to linear brightness units, project the mosaic into the fisheye equal-area coordinate system, and apply the finished terrain mask. This prepares the observed sky brightness mosaic for subtraction of the natural sky model. The masked mosaic is saved to the Griddata directory.
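If the linear unit is nano-Lamberts (the unit used for the airglow model inputs), the conversion from mag/arcsec$^2$ can be sketched as below. The constants are those of the V-band conversion commonly used in night-sky brightness work and should be treated as illustrative rather than a statement of this pipeline's exact implementation:

```python
import math

def mag_to_nl(mag_arcsec2):
    """Convert V-band surface brightness [mag/arcsec^2] to nano-Lamberts.

    NOTE: constants follow a commonly used V-band conversion; verify
    against your own calibration before relying on them.
    """
    return 34.08 * math.exp(20.7233 - 0.92104 * mag_arcsec2)

# A natural dark-sky zenith of ~21.6 mag/arcsec^2 is roughly 78 nL
```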
To add the component models into a combined natural sky model.
naturalsky.py > AggregateModel()
This class uses arcpy (ArcGIS) to add together the component natural sky models (Airglow, A.D.L, Galactic, and Zodiacal) into a combined natural sky model, project the resulting model into the fisheye equal-area coordinate system, and apply the finished terrain mask. This prepares the natural sky model to be subtracted from the observed sky brightness mosaic. The combined natural sky model is saved to the Griddata directory.
To produce the anthropogenic light mosaic.
naturalsky.py > SkyglowModel()
This class uses arcpy (ArcGIS) to subtract the combined natural sky model from the observed sky-brightness mosaic. The resulting mosaic contains only light inferred to come from anthropogenic sources and is saved to the Griddata directory.
To provide statistical assessments of the observed and anthropogenic light mosaics along with a summary figure used to assess the quality of the model fit.
naturalsky.py > MosaicAnalysis()
This class uses arcpy (ArcGIS) and numpy to calculate various statistics on the pixel values of both the observed sky brightness and anthropogenic light mosaics. These stats are saved within the natsky_model_params.xlsx excel sheet in the night's Calibdata directory. This class also uses matplotlib to generate a summary figure (natsky_model_fit.png) saved in both the Griddata and Graphics directories that can be used to visually assess the quality of the natural sky model fit.
- 1. All Light Source Metrics
- 2. Anthropogenic Light Metrics
- 3. Number of Visible Stars
- 4. ALR Model
- 5. Albedo Model
- 6. Nearby Places
- 7. Sky Quality Metrics
- 8. Panorama and Illumination Graphics
- 9. Summary Tables
This script computes sky luminance and illuminance statistics for all light sources using the skybright median filtered mosaic.
process_metrics.py > process_illumall() > illumall.py > calculate_statistics()
The script uses arcpy (ArcGIS) to read in the median-filtered observed sky brightness mosaic and compute various luminance and illuminance metrics in several sky regions or zones. Metrics include horizontal and vertical illuminance, scalar illuminance, and the min/max/mean all-sky luminance down to various horizon and Zenith angle limits.
This script computes sky luminance and illuminance statistics for anthropogenic light sources only using the anthropogenic light mosaic.
process_metrics.py > process_skyglow() > skyglow.py > calculate_statistics()
The script uses arcpy (ArcGIS) to read in the anthropogenic light mosaic generated by naturalsky.py and compute various luminance and illuminance metrics in several sky regions or zones. Metrics include horizontal and vertical illuminance, scalar illuminance, and the min/max/mean all-sky luminance down to various horizon and Zenith angle limits.
This script computes the number of stars visible to the naked eye given the observed sky brightness, atmospheric extinction, and local terrain mask.
process_metrics.py > process_starsvis() > starsvis.py > calculate_stars_visible()
The script uses arcpy (ArcGIS) to read in a catalog of bright stars (SAOJ200079.shp) containing the on-sky positions and calibrated magnitudes of each star. The stars' magnitudes are corrected for atmospheric extinction and then compared to the limiting magnitude of both the observed sky and the natural sky model. Stars with extincted magnitudes brighter than the limiting magnitude are counted as visible.
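The per-star visibility test amounts to a simple magnitude comparison. A hypothetical helper (the pipeline performs this on the star shapefile with arcpy):

```python
def star_visible(catalog_mag, extinction_coeff, airmass, limiting_mag):
    """A star counts as visible if its extincted magnitude is brighter
    than (i.e. numerically less than or equal to) the limiting magnitude."""
    extincted_mag = catalog_mag + extinction_coeff * airmass
    return extincted_mag <= limiting_mag

# Example: a 5.8 mag star at airmass 2 with k = 0.2 mag/airmass is
# extincted to 6.2 mag, fainter than a 6.0 mag limit, so not visible.
```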
This script extracts the site-specific All-sky Light Pollution Ratio (ALR) model value from a pre-generated ALR model.
process_metrics.py > process_alrmodel() > alrmodel.py > calculate_alr_model()
The script uses arcpy (ArcGIS) to read in a pre-generated geo-referenced ALR model (alrmodel) made using VIIRS satellite imagery from the year 2014 (see Duriscoe et al. 2018 for model details). The longitude and latitude of the observing site are used to extract the ALR value from the model corresponding to the given location.
This script computes the site-specific albedo (surface reflectance) model value.
process_metrics.py > process_albedomodel() > albedomodel.py > calculate_albedo_model()
The script uses arcpy (ArcGIS) to read in a pre-generated geo-referenced albedo model (ws_albedo) covering the continental United States. The longitude and latitude of the observing site are used to extract the albedo value from the model corresponding to the given location.
This script computes distances and Walker's law values to nearby places using 2010 Census Data.
process_metrics.py > process_places() > places.py > calculate_places()
The script loads in 2010 Census population data (Places21k.xlsx) and uses the longitude and latitude of the observing site to calculate great-circle distances to each census place. The script uses the distance and population to calculate Walker's Law values (Walker 1977) for each place within 450 km of the observing site. Walker's Law is a simple numerical metric that assesses the impact of a population center on light pollution in a given area and is defined as:
$$W \propto P \, d^{-2.5}$$

where $P$ is the population of the census place and $d$ is its distance from the observing site in km. Results are saved to a spreadsheet (cities.xlsx) for places within 450 km of the observing site.
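The distance and impact calculation can be sketched as follows. These are hypothetical helpers: the Walker's Law value is given in proportional form (any overall scaling constant is omitted), and the haversine formula is a standard way to compute great-circle distances:

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Haversine great-circle distance between two lat/lon points [deg]."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

def walker_value(population, distance_km):
    """Walker's Law (proportional form): impact ~ population * distance^-2.5."""
    return population * distance_km ** -2.5
```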
This script computes sky quality index (SQI) and synthetic sky quality meter (SQM) values, along with a few photometric indicators using simple aperture photometry measurements.
process_metrics.py > process_skyquality() > skyquality.py > calculate_sky_quality()
The script loads in the pixel histogram tables generated during the Skyglow step in order to calculate Sky Quality Index (SQI) values for a variety of horizon and Zenith angle limits. This script also computes simple aperture photometry on the calibrated FITS images in order to derive synthetic Sky Quality Meter (SQM) values for the observing site, taking into account the sky background brightness along with light from bright stars and planets. Simple aperture photometry is also used to calculate some Zenith luminance metrics.
This script generates the final panoramic all-sky images and vertical illumination graphics.
process_metrics.py > process_drawgraphics() > drawgraphics.py > generate_graphics()
The script uses arcpy (ArcGIS) to load the full-resolution, median filtered, natural sky model, and anthropogenic light mosaics, place each mosaic into a map template designed to display panoramic images, and then export each map to a JPEG image in the Graphics folder. This script also loads in the vertical illumination data from the Illumall and Skyglow steps to generate a figure comparing the vertical illumination trends from all light sources and from artificial light only, also placed in the Graphics folder. Graphics are produced for each individual data set.
To collect and summarize all of the sky brightness metrics within a single Excel file, while properly formatting the data for import into an Access database.
process_metrics.py > process_tables() > savetables.py > generate_tables()
The script takes all of the sky brightness metrics calculated in previous steps, along with additional site metadata and calibration info, and places them within a single Excel workbook, with multiple spreadsheets, saved into the Tables directory. This Excel workbook is formatted so that it can be properly imported into the Access database used to keep track of all NSNSD CCD data processing results.
This project is in the worldwide public domain. As stated in CONTRIBUTING:
This project is in the public domain within the United States, and copyright and related rights in the work worldwide are waived through the CC0 1.0 Universal public domain dedication.
All contributions to this project will be released under the CC0 dedication. By submitting a pull request, you are agreeing to comply with this waiver of copyright interest.



