

IlluminaGuide2016

ILLUMINA V1 USER'S GUIDE

Martin Aubé, Ph.D. & Alexandre Simoneau M.Sc. copyright 2016

Latest update: 2020-02-11

This version is deprecated. Please refer to IlluminaGuide2021 for the most up-to-date one.

General information

This user's guide aims to help users of the Illumina sky brightness model prepare and manage their own simulations. We hope the document is accurate enough, and we will be happy to improve it based on any difficulties you encounter while using it. For any help, please contact the PI, Dr Martin Aubé (martin.aube at cegepsherbrooke.qc.ca). This guide is the most recent one and incorporates recently added features of the model, such as hyperspectral support, the subgrid obstacle filling factor and the contribution of cloud reflection.

Operating system

Illumina should be used on a computer running Linux, with Fortran and gcc compilers (e.g. gfortran) and the mercurial versioning system installed. A few convenience scripts also require Python (2.7).

Other software dependencies

The following software packages are required by the system:

  • make
  • bash
  • vim-common
  • python-numpy
  • python-pyproj
  • python-matplotlib
  • python-gdal
  • python-pyfits
  • python-scipy
  • gri
  • bc
  • imagemagick
  • gnuplot
  • mercurial
  • tix
  • gdal

Other python libraries might be required.

In all cases, the most recent version of the code should be used. The code is evolving rapidly, so by updating your version frequently you will benefit from new features and bug fixes.

Installing the code

The ILLUMINA model is available from bitbucket:

https://bitbucket.org/aubema/illumina

All source code is released under the GNU Public License.

To install the model from bitbucket, please follow these steps:

If you are a non-developer user:

(:source:)
cd
mkdir hg
cd hg
git clone https://github.com/aubema/illumina.git
(:sourceend:)


If you do not respect this directory structure exactly, the programs may fail to execute properly.

Then you must select the desired version and compile the code:

(:source:)
cd
cd hg/illumina
git checkout v1
bash makeILLUMINA
(:sourceend:)

Then modify the $HOME/.bashrc file by typing the following commands in a terminal window. This will make the programs executable from anywhere on the file system.

(:source:)
echo "PATH=$PATH:$HOME/hg/illumina/:$HOME/hg/illumina/bin" >> $HOME/.bashrc
source $HOME/.bashrc
(:sourceend:)

If you are a graphycs1 developer, you will also want to edit the PYTHONPATH environment variable:

(:source:)
echo "export PYTHONPATH=$PYTHONPATH:$HOME/hg/illumina/" >> $HOME/.bashrc
source $HOME/.bashrc
(:sourceend:)

Downloading and preparing the required satellite images

ILLUMINA requires some satellite data to run properly, namely a digital elevation model, the ground reflectance at multiple wavelengths and the nocturnal light emittance. These data also need to be projected into a suitable spatial reference system and clipped to the simulation domain.

Domain definition

Defining the simulation domain is a crucial step of the input preparation, as it directly affects everything afterwards. The first step is to define the location(s) where the simulation of the artificial sky brightness is desired. Then the projection needs to be defined, as the model needs to work with coordinates in meters instead of degrees. For this, the EPSG.io website can be used. Simply search for the country in which the simulation point(s) is/are located and select a projection that covers the region with sufficient accuracy. You then need to find the limits of the desired domain in this reference system. The coordinate transform tool on the above website can be useful in that regard.

One must create a parameters file with the ini extension to define the domain. It should have the following format:

(:source:)
srs: epsg:3750
bbox: 517107 2047513 1017107 2447513
pixsize: 1000
(:sourceend:)

where srs is the spatial reference system in a format supported by the gdal program, bbox is the domain extent given as the coordinates of the left, bottom, right and top borders respectively, and pixsize is the desired pixel resolution in meters.
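For clarity, here is a minimal Python sketch (not part of Illumina; the parse_domain helper is purely illustrative) showing how such a parameters file can be read and how the grid dimensions follow from bbox and pixsize:

```python
# Illustrative sketch: parse a domain.ini-style file and derive the
# grid dimensions implied by bbox and pixsize.

def parse_domain(text):
    """Parse 'key: value' lines into a dict (splits on the first colon only)."""
    params = {}
    for line in text.splitlines():
        if ':' in line:
            key, _, value = line.partition(':')
            params[key.strip()] = value.strip()
    return params

ini = """srs: epsg:3750
bbox: 517107 2047513 1017107 2447513
pixsize: 1000"""

p = parse_domain(ini)
left, bottom, right, top = (float(v) for v in p['bbox'].split())
pixsize = float(p['pixsize'])

nx = int(round((right - left) / pixsize))   # pixels along x
ny = int(round((top - bottom) / pixsize))   # pixels along y
print(nx, ny)  # → 500 400
```

With the sample values above, the domain is 500 km by 400 km at 1000 m per pixel.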

Alternatively, the defineDomain.py script can be used to generate the domain.ini file. The script asks for the coordinates of the observer and then defines a domain centered around it. It also automatically suggests a valid projection to use.

VIIRS-DNB imagery

The night emittance is obtained from VIIRS-DNB imagery that can be found here. One should download the appropriate tile(s) for the desired period (year and month). We suggest the VCMSLCFG configuration of the monthly composite product because of its stray light correction, but the choice is left to the user. You will want to extract the 'avg_rade9.tif' file, as it contains the actual values, whereas the 'cf_cvg.tif' file contains information related to the quality of the image. The tif file(s) should be placed inside a subfolder named VIIRS-DNB inside your experiment directory.

SRTM data

The digital elevation model is made with the SRTM elevation data that can be found here. One should use the spatial filter to select only the required tiles, and then follow the download procedure. The extracted hgt files should be placed inside a subfolder named SRTM inside your experiment directory.

To extract multiple archives at once, one can use unzip "*.zip"

MODIS reflectance data

The reflectance data is obtained from MODIS imagery (MYD09A1 Version 6 product). The data can be found here. One should use the spatial and temporal filter to select only the required tiles for the desired period, and then follow the download procedure. It's a good idea to check for the presence of clouds, as they can affect the data. The hdf files should be placed inside a subfolder named MODIS inside your experiment directory.

To create the best inventory, choose the MODIS date that matches the VIIRS-DNB date as closely as possible.

Processing the input images

The Illuminutils.py script should be executed from the experiment directory containing all three data subfolders explained above.

Seven files should be produced by this script:

  • row_column_to_x_y_lon_lat.csv
  • stable_lights.bin
  • srtm.bin
  • refl_b01.bin
  • refl_b02.bin
  • refl_b03.bin
  • refl_b04.bin

The csv file allows for easy transformation between the lat/lon coordinate system and pixel positions.

[Images: hawaii_viirs.jpg (VIIRS, stable_lights.bin), hawaii_srtm.jpg (SRTM, srtm.bin), hawaii_modis.jpg (MODIS, refl_b0x.bin)]

Sample files for Hawaii

The standard format is raw binary, and the pixel size is constant in meters (1000 m per pixel).

Making light inventory for each zone

We have to create an inventory file. In Illumina, inventory files allow the user to define different geographical zones, each with a different mix of lamp types (differing by their photometric function or light output pattern (LOP), their spectrum and their lamp height), average distance between obstacles, obstacle height and obstacle filling factor. Two or more zones may be in the same geographical region or partly overlap. All zones are circular, characterized by a center position and a radius. Each new zone overwrites the previous one wherever they intersect. All points not included in any zone will be ignored. To create a zone, edit an ASCII file with a simple text editor such as kwrite or gedit, following the format shown below:

Sample inventory file for the Hawaii territory

(:source:)
# lat lon R hobs dobs fobs hlamp Zone inventory Comment
21.4474 -157.9712 50 7 25 0.5 7 90_H_5 10_M_10 # Oahu
21.0052 -157.0123 40 7 25 0.5 7 90_H_5 10_M_10 # Molokai + Lanai
20.7764 -156.1512 64 7 25 0.5 7 18_H_10 72_H_0 10_M_10 # Maui
19.6468 -155.5714 103 7 25 0.5 7 87_L_10 8_H_10 5_M_5 # Big Island
19.2878 -155.2179 23 7 25 0.5 7 0_L_0 # Lava
(:sourceend:)

This file can have any number of header lines, as long as they begin with a '#' symbol. Anything on a line following this symbol will not be considered by the model. Each data line contains several parameters:

  1. lat: central latitude of the circular zone.
  2. lon: central longitude of the circular zone.
  3. R: radius of the circular zone (pixels).
  4. hobs: average subgrid obstacle height (in meters)
  5. dobs: average distance between subgrid obstacles (in meters)
  6. fobs: obstacle filling factor, i.e. the probability for a photon to hit an obstacle (0. to 1.)
  7. hlamp: average lamp height relative to the ground (in meters)
  8. list of lamps characteristics

Each lamp characteristic is composed of three fields separated by '_':

  • The first field is the weight of the lamp type defined by the following two characteristics. The weight is later converted to a ratio by normalizing to the sum of all the weights for that zone.
  • The second is a reference word corresponding to the spectral power distribution of the lamp*.
  • The third is a reference word corresponding to the angular power distribution of the lamp, or light output pattern (LOP)*.

The weighting is applied to the luminous flux of the lamp's spectral power distribution, in lumens. This means the spectra are weighted by the photopic sensitivity curve.

Example


Suppose one zone is composed of 50% HPS with the angular photometry file toto1_fctem.dat, 20% HPS with the photometry file toto2_fctem.dat, and 30% LED4000K with the photometry file toto3_fctem.dat, and assume that you use the spectral power distributions provided in the Example/Lights directory (HPS_Helios.spct and 4LED_LED-4000K-Philips.spct) to create the light inventory.

Then you should write 50_HPS_toto1 20_HPS_toto2 30_4LED_toto3 in column 8 of your inventory file for that zone.
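The weight-to-ratio normalization described above can be sketched in a few lines of Python (illustrative only; the lamp_ratios helper is not an Illumina function):

```python
# Illustrative sketch: convert the lamp weights of one zone into ratios
# by normalizing to their sum, as the model does internally.

def lamp_ratios(entries):
    """entries: list of 'weight_spct_lop' strings from one inventory line."""
    weights = [float(e.split('_')[0]) for e in entries]
    total = sum(weights)
    return [w / total for w in weights]

# Zone from the Hawaii sample inventory: 90_H_5 10_M_10
print(lamp_ratios(['90_H_5', '10_M_10']))  # → [0.9, 0.1]

# The example zone above: weights 50, 20, 30 already sum to 100
print(lamp_ratios(['50_HPS_toto1', '20_HPS_toto2', '30_4LED_toto3']))  # → [0.5, 0.2, 0.3]
```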

The data referenced by the last two fields must be located in a subfolder named 'Lights'. This folder must contain the following files in addition to the ones used to define the lamp inventory:

  • photopic.dat
  • scotopic.dat
  • viirs.dat

These files can be found in the Illumina installation directory (Examples/Lights). Any additional file used to characterize a lamp must follow this format:

  • Angular light output pattern (LOP) files must have the extension '.lop'. They consist of two-column ASCII data, where the first column is the relative intensity and the second is the zenith angle in degrees. The lop file must contain 181 data points, from z=0 to z=180 in 1 deg. steps.
  • Spectral files must have the extension '.spct'. They are two-column ASCII data files with a single header line. The first column contains the wavelength in nm and the second contains the relative intensity.

The normalization of these files is not important, as it will be done by the programs.

*In all cases, the characters preceding the first underscore (_) in the lop or spct file name form the reference word that must be written in the inventory file.

All .spct files must have the same wavelength scale. All LOP files must share the same angle scale.
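A small illustrative Python snippet (the reference_word helper is hypothetical, not part of Illumina) showing how the reference word follows from a file name:

```python
import os

# Illustrative sketch: the reference word used in the inventory file is
# everything before the first underscore in the .lop or .spct file name.
def reference_word(path):
    base = os.path.basename(path)
    return base.split('_')[0]

print(reference_word('HPS_Helios.spct'))              # → HPS
print(reference_word('4LED_LED-4000K-Philips.spct'))  # → 4LED
```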

This inventory file can be generated from a list of geolocalized light sources with individual characteristics, written in the following format:

(:source:)
# lat lon pow hobs dobs fobs hlamp spct lop
21.295693 -157.856846 250 20 25 0.9 7 MH 5
21.295776 -157.856782 150 20 25 0.9 7 LED 0
21.295844 -157.857114 100 30 30 0.85 7 MH 5
21.286488 -157.845900 100 50 10 0.3 10 LED 1
(:sourceend:)

where

  1. lat: Latitude of the light source
  2. lon: Longitude of the light source
  3. pow: Intensity of the source in lumens
  4. hobs: Average obstacle height (in meters)
  5. dobs: Average distance between obstacles (in meters)
  6. fobs: Obstacle filling factor, i.e. the probability for a photon to hit an obstacle (0. to 1.)
  7. hlamp: Light source height relative to the ground (in meters)
  8. spct: Spectral power distribution keyword
  9. lop: Angular power distribution keyword

This file and the domain definition file from step 5.1 can then be used with the `point_source_to_zon.py` script to generate an inventory file to which uniform zones can be added. Keep in mind that the zone defined last has priority.
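The "last zone has priority" rule can be illustrated with the following Python sketch (not Illumina's actual implementation; the grid size and zones are made-up values):

```python
import numpy as np

# Illustrative sketch of zone priority: zones are painted onto the grid
# in file order, so the zone defined last overwrites earlier ones wherever
# they intersect; pixels outside every zone keep the value 0 and are
# ignored by the model.
ny, nx = 100, 100
zone_map = np.zeros((ny, nx), dtype=int)   # 0 = not in any zone

# hypothetical zones: (center_x, center_y, radius), all in pixels
zones = [(50, 50, 30), (60, 50, 20)]

yy, xx = np.mgrid[0:ny, 0:nx]
for zone_id, (cx, cy, r) in enumerate(zones, start=1):
    inside = (xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2
    zone_map[inside] = zone_id             # later zones overwrite earlier ones

print(zone_map[50, 55])  # inside both circles → 2 (last zone wins)
print(zone_map[50, 25])  # inside zone 1 only  → 1
print(zone_map[0, 0])    # outside all zones   → 0
```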

Create the reflectance file list

We are using the MODIS reflectance product in 4 bands. The reflectance at a given wavelength will be interpolated or extrapolated from the MODIS data. One needs to create a file associating the modis files with their correct wavelengths, where the first line gives the number of reflectance images. The names are the file names excluding the path; the path will be needed later. The wavelengths are in nanometers. This file is usually named modis.dat, but any other name can be used.

(:source:)
4
469 refl_b03.bin
555 refl_b04.bin
645 refl_b01.bin
858 refl_b02.bin
(:sourceend:)
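The idea behind this band-to-wavelength association can be sketched as a simple piecewise-linear interpolation in Python (the reflectance values below are made-up placeholders, and Illumina's actual interpolation/extrapolation scheme may differ; note that numpy.interp clamps rather than extrapolates outside the band range):

```python
import numpy as np

# Sketch of interpolating a reflectance at an arbitrary wavelength from
# the 4 MODIS bands listed in modis.dat. Reflectances are hypothetical.
band_wl = np.array([469.0, 555.0, 645.0, 858.0])   # nm, sorted ascending
band_refl = np.array([0.05, 0.08, 0.10, 0.30])     # placeholder pixel values

refl_600 = np.interp(600.0, band_wl, band_refl)    # between the 555 and 645 nm bands
print(refl_600)  # ≈ 0.09 (halfway between 0.08 and 0.10)
```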

Run make_inputs script

Run the script make_inputs.py with files created at steps 5.5, 6.1 and 6.2 as summarized below:

  1. Inventory file name created at step 6.1
  2. Domain definition file name created at step 5.1
  3. Number of wavelengths to model. These will be linearly spaced and will use the central wavelength of the bin. NB: The bins should be at least 1nm wide.
  4. Minimal wavelength used
  5. Maximal wavelength used
  6. The script then asks whether you want to generate the inputs for the pre-processing program. Most of the time you will want to answer Y, or simply press ENTER, since it is the default.
  7. Output root name of the experiment: this name will be used to create the output light spectral flux files; _XXX_lumlp_NNN.bin will be appended to it, with XXX giving the wavelength in nm and NNN the zone number.
  8. viirs file (binary Illumina format): path to the `stable_lights.bin` file name of step 5.5
  9. reflectance files list file: file name of step 6.2
  10. path of the directory in which the files referenced by the previous file are located
  11. srtm file: path to the `srtm.bin` file name of step 5.5
  12. Whether to remove the background noise in the VIIRS-DNB data. Default is YES.
  13. If you answered YES to the previous question: the cutoff value in physical units. The default is computed by locating the most common value and taking the largest value that is a fourth as frequent as the most common one.
  14. aerosol characterization: either 'maritime', 'rural' or 'urban' followed by an underscore (_), the letters 'RH' and a 2 digit percentage qualifying the relative humidity of the region. Currently the only available choices are '00', '50', '70', '80', '90', '95', '98' and '99'.
  15. The script then asks whether you want to execute the pre-processing program, viirs2lum. Most of the time you will want to answer Y, or simply press ENTER, since it is the default.

Example run

Here is the input needed to execute the make_inputs.py script for the 'Example' directory present in the Illumina installation folder:

  1. inventory.txt
  2. domain.ini
  3. 5
  4. 380
  5. 830
  6. Y
  7. Hawaii
  8. stable_lights.bin
  9. modis.dat
  10. .
  11. srtm.bin
  12. Y
  13. (leave empty, using default value)
  14. maritime_RH70
  15. Y
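The wavelength binning of step 3 can be sketched as follows (an assumption based on the description above: N linearly spaced bins between the minimal and maximal wavelengths, each modeled at its bin center; with the example values this yields the 605 nm wavelength seen in the output file names):

```python
# Illustrative sketch of the wavelength binning: N equal-width bins
# between wl_min and wl_max, each represented by its central wavelength.
def bin_centers(wl_min, wl_max, n_bins):
    width = (wl_max - wl_min) / n_bins
    return [wl_min + width * (i + 0.5) for i in range(n_bins)]

print(bin_centers(380, 830, 5))  # → [425.0, 515.0, 605.0, 695.0, 785.0]
```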

Quick file check

The script should produce two directories, 'Lamps' and 'Intrants', and two files, 'exp_name.zon' and 'integration_limits.dat', where exp_name is the experiment name, a parameter given at step 7 of the 'make_inputs.py' script. The 'Lamps' directory should contain three files for each zone defined in the inventory file: the angular and spectral photometry of each zone in binary (.bin), ASCII (.dat) and plotted (.png) form. The 'Intrants' directory should contain N*X 'fctem_wl_XXX_zon_NNN.dat' files, where N is the number of zones and X the number of wavelengths used. In the example, both these numbers are 5, so you should have 25 'fctem' files. You should also have 'lumlp' files, one for each wavelength and zone combination plus one for each wavelength giving the global view. There must also be 4 files named 'exp_name_altlp.bin', 'exp_name_obstd.bin', 'exp_name_obsth.bin' and 'exp_name_obstf.bin'. You should also find one 'modis_XXX.bin' file per wavelength and two '.lst' files. In addition to all that, you should see some symbolic links: one for 'integration_limits.dat', one for 'srtm.bin' and one '.mie.out' for each wavelength. For the given example, that represents a total of 72 files.

You can open all the _XXX_lumlp_NNN.bin files with imagemagick to confirm that they are realistic for the specific experiment. The lumlp files give the total lamp spectral flux for each pixel of a zone at the given wavelength.

Example run at 605nm with contrast boosted

[Images: Hawaii_lumlp_605_tot.jpg (Total lumlp), Hawaii_lumlp_605_1.jpg (Zone 1 = Oahu), Hawaii_lumlp_605_2.jpg (Zone 2 = Molokai and Lanai), Hawaii_lumlp_605_3.jpg (Zone 3 = Maui), Hawaii_lumlp_605_4.jpg (Zone 4 = Big Island), Hawaii_zones_map.png (Map of the zones made with free map tools)]

Zone 5, which covers the lava lakes, is all black because its light is not considered in the model. To achieve this, its lamp inventory was left empty. You can verify that in the 'inventory.txt' file.

Preparation of the batch file

This file is a bash program that will generate many experiment runs according to the number of desired cases for each variable.

We must define the different variables that will be used by the makeBATCH-multispectral.bash program. This is done by editing a file named makeBATCH.in (note that you can use another name). One file can be created for each experiment, if required. A sample makeBATCH.in file is provided below:

(:source:)
# Illumina's makeBATCH-multispectral.bash input file
batch_file_name TortureMammouth  # base name of the different scripts that will be submitted to the queue
pixel_size 1000  # pixel size in meters
experiment_name Hawaii  # modeling experiment name
pressure 101.3  # lowest domain level atmospheric pressure in kPa
estimated_computing_time 120  # estimated computing time per case in hours
terrain_elevation_file srtm.bin  # bin file containing the elevation model
relative_humidity 70  # atmospheric relative humidity
aerosol_model maritime  # aerosol model (rural, urban, maritime)
cloud_model 0  # cloud model selection: 0=clear, 1=Thin Cirrus/Cirrostratus, 2=Thick Cirrus/Cirrostratus, 3=Altostratus/Altocumulus, 4=Cumulus/Cumulonimbus, 5=Stratocumulus
nearest_source_distance 150  # minimal distance allowed to the nearest light source (m)
1_radius 27  # side length of the square within which the simulation runs at full resolution (one pixel) - must be an odd multiple of 9 (e.g. 9, 27, 45, 63, 81, ...)
3_radius 81  # side length of the square within which the resolution is three pixels wide - must be an odd multiple of 9, larger than 1_radius; outside it the resolution is nine pixels wide
stop_limit 15000.  # stop the computation when the new voxel contribution is less than 1/stop_limit of the cumulated flux (suggested value 15000.; for purely theoretical work, e.g. only one lit pixel, increase it to at least 1E10)
x_positions 269  # list of x positions of the observer (pixel)
y_positions 245  # list of y positions of the observer (pixel), linked to x_positions
z_positions 0.  # list of z positions of the observer above ground (m)
scattering_skip 71  # list of 2nd scattering acceleration factors (must be a prime number)
scattering_radius 4000  # list of maximum 2nd scattering radii (m)
elevation_angles 90 45  # list of elevation viewing angles (deg)
azimuth_angles 0 10  # list of azimuth viewing angles (deg)
aerosol_optical_depth 0.11  # list of AOD values at 500 nm
angstrom_coefficients 0.7  # list of Angstrom exponent values, linked to aerosol_optical_depth
(:sourceend:)

If you used Illumina prior to Dec 4 2017, note that z_position is now defined as the elevation above local ground level; it was previously the voxel number along the z axis.

Submitting the calculations to a Linux cluster

To perform the calculations, we now connect to a 'Cluster'. In our case, we connected to 'Mammouth serial II' located at Université de Sherbrooke.

Then it is necessary to recompile the ILLUMINA model using 'makeILLUMINA' or through the following command:

(:source:) bash makeILLUMINA (:sourceend:)

The `Intrants` directory created for each experiment in step 6 should now be transferred to the cluster interactive node via the scp protocol.

Preparing the batch execution

Now we need to run the program makeBATCH-multispectral.bash that will run the multiple calculations for each experiment directory on the cluster.

(:source:) nohup bash makeBATCH-multispectral 'path_to_makeBATCH.in/filename_makeBATCH.in' & (:sourceend:)

  • Use the command qstat or bqmon -u to check the status of the compute nodes before or during execution.
  • To delete a task, use the qdel command followed by the number of the job to delete.
  • To delete all your jobs use the following command:

(:source:)
list=`qstat @ms -u $(whoami)`
for i in $list; do
  if [ `echo $i | grep ms` ]; then
    echo $i
    qdel $i
    sleep 0.01
  fi
done
(:sourceend:)

To execute the calculations, perform the following command:

(:source:) bash 'output_batch_file' (:sourceend:)

This step should be repeated for each season or period and each relative humidity value.

Extracting results

ILLUMINA generates two image files per calculation: one showing the contribution of each pixel to the calculated sky radiance (PCL) and the other illustrating the sensitivity to light pollution (PCW). It also produces the sky radiance value in the specified direction.

To extract the data, you first have to create a list of files to extract. This is done by concatenating the 'output_batch_file' files produced in step 9. To do so, go to the $HOME directory and type:

(:source:) cat 'output_batch_file'* > exp_name.list (:sourceend:)

using the same string as in step 9. To extract the data, simply execute:

(:source:) extract-output-data.bash exp_name exp_name.list (:sourceend:)

At the end of the execution of extract-output-data.bash, a Results directory has been created containing all PCL files (e.g. PCL-x302y58-2005-ta0.05-wl436-el15-az0.bin), all PCW files (e.g. PCW-x302y58-2005-ta0.05-wl436-el15-az0.bin) and .out files (the log of each execution).

The script 'extract_more.py' can be useful to get the important information out of all the files produced by the previous script. Simply call it from where you want the results to be produced. It will ask you for a few parameters:

  1. Results folder's path: The path to the directory containing the results folder.
  2. Integration_limits file path: Path to the file 'integration_limits.dat', generated by the 'make_input.py' script. '/integration_limits.dat' will be automatically appended to your input.
  3. Generate ratio graph: Answer 'n' to this, we will come back to it later.
  4. Generate fits data cubes: Whether or not to generate FITS cube containing the contribution map data for each wavelength.

Alternative scenarios

Illumina can simulate alternative lighting scenarios using the same light intensities normalized in photopic vision. To do this, you simply need to build your alternative lamp inventory in a similar manner to the current inventory. Alternative inventory files must follow the format shown below:

(:source:)
# Zone inventory
1_4LED_0 # Oahu
1_4LED_0 # Molokai + Lanai
1_4LED_0 # Maui
1_4LED_0 # Big Island
0_4LED_0 # Lava
(:sourceend:)

where you only have a lamp inventory similar to the original inventory file, but without the zone characteristics. Once you have created that file, you can execute the 'alt_scenario_maker.py' script. Here is the needed input:

  1. New name: A spaceless name identifying the scenario.
  2. Old inventory path: The path (name included) to the original inventory file
  3. New inventory path: The path (name included) to the new inventory file

Extracting results

If you simulated multiple scenarios, you must manually organize the results folders in a single Results directory, where the name of each subfolder ends with an underscore followed by a string identifying the scenario. For instance, you could have something like `Hawaii_current` and `Hawaii_LED`. You can then use the python script extract_more.py. This script will generate data files containing the LP spectrum of each scenario for each set of experiment parameters. It will also produce a few graphs comparing the scenarios, and a data cube in the FITS format for each experiment parameter group and scenario. These can then be opened by your favorite FITS program, for instance SAOImage DS9.

Quick mercurial usage guide for developers

To publish your locally modified codes on the server

(:source:)
hg commit -u graphycs1 -m "new printout"
hg push
(:sourceend:)

To update your local version of the codes

(:source:)
hg pull
hg update
(:sourceend:)

Page last modified on May 26, 2022, at 03:54 pm