fermipy.diffuse subpackage

The fermipy.diffuse sub-package is a collection of standalone utilities that allow the user to parallelize the data and template preparation for all-sky analysis.

The tools described here perform a number of functions:

  • Making binned counts maps and exposure maps over the whole sky.
  • Managing model components for all-sky analysis, including both diffuse emission and point source contributions. This includes making spatial-spectral templates and expected counts maps for the various components.
  • Building integrated models for a collection of model components.
  • Fitting those models.

Overview

This package implements an analysis pipeline to prepare the data and templates for analysis. This involves a fair amount of bookkeeping and looping over various things. It is probably easiest to first describe this with a bit of pseudo-code that represents the various analysis steps.

The various loop variables are:

  • Input data files

    For practical reasons, the input photon event (FT1) files are split into monthly files. In the binning step of the analysis we loop over those files.

  • Binning components

    We split the data into several “binning components” and make separate binned counts maps for each component. A binning component is defined by an energy range and a data sub-selection (such as PSF event type and zenith angle cuts).

  • Diffuse model components

    The set of all the model components that represent diffuse emission, such as contributions from cosmic-ray interactions with interstellar gas and radiation fields.

  • Catalog model components

    The set of all the catalog sources (both point sources and extended sources), merged into a few distinct contributions.

  • Diffuse emission model definitions

    A set of user defined models that merge the various model components with specific spectral models.

# Data Binning: prepare the analysis directories and precompute the DM spectra

# First we loop over all the input data files and split up the
# data by binning component and bin the data using the command
# fermipy-split-and-bin-sg, which is equivalent to:
for file in input_data_files:
    fermipy-split-and-bin(file)

# Then we loop over the binning components and coadd the binned
# data from all the input files using the command
# fermipy-coadd-split-sg, which is equivalent to:
for comp in binning_components:
    fermipy-coadd-split(comp)

# We also loop over the binning components and compute the
# exposure maps for each binning component using the command
# fermipy-gtexpcube2-sg, which is equivalent to:
for comp in binning_components:
    gtexpcube2(comp)

# We loop over the diffuse components that come from GALProp
# templates and refactor them using the command
# fermipy-sum-ring-gasmaps-sg, which is equivalent to:
for galprop_comp in diffuse_galprop_components:
    fermipy-coadd(galprop_comp)

# We do a triple loop over all of the diffuse components, all the
# binning components and all the energy bins and convolve the
# emission template with the instrument response using the command
# fermipy-srcmaps-diffuse-sg, which is equivalent to:
for diffuse_comp in diffuse_components:
    for binning_comp in binning_components:
        for energy in energy_bins:
            fermipy-srcmap-diffuse(diffuse_comp, binning_comp, energy)

# We then do a double loop over all the diffuse components and all
# the binning components and stack the template maps into single
# files using the command
# fermipy-vstack-diffuse-sg, which is equivalent to:
for diffuse_comp in diffuse_components:
    for binning_comp in binning_components:
        fermipy-vstack-diffuse(diffuse_comp, binning_comp)

# We then do a double loop over source catalogs and binning
# components and compute the templates for each source using the
# command
# fermipy-srcmaps-catalog-sg, which is equivalent to:
for catalog in catalogs:
    for binning_comp in binning_components:
        fermipy-srcmaps-catalog(catalog, binning_comp)

# We then loop over the catalog components (essentially
# sub-sets of catalog sources that we want to merge)
# and merge those sources into templates using the command
# fermipy-merge-srcmaps-sg, which is equivalent to:
for catalog_comp in catalog_components:
    for binning_comp in binning_components:
        fermipy-merge-srcmaps(catalog_comp, binning_comp)

# At this point we have a library of template maps for all the
# emission components that we have defined.
# Now we want to define specific models.  We do this
# using the commands
# fermipy-init-model and fermipy-assemble-model-sg, which is equivalent to:
for model in models:
    fermipy-assemble-model(model)

# At this point, for each model under consideration, we have an
# analysis directory that is set up for fermipy.

Configuration

This section describes the configuration management scheme used within the fermipy.diffuse package and documents the configuration parameters that can be set in the configuration file.

Analysis classes in the fermipy.diffuse package all inherit from the fermipy.jobs.Link class, which allows the user to invoke the class either interactively within python or from the UNIX command line.

From the command line

$ fermipy-srcmaps-diffuse-sg --comp config/binning.yaml --data config/dataset_source.yaml --library models/library.yaml

From python there are a number of ways to do it; we recommend this:

from fermipy.diffuse.gt_srcmap_partial import SrcmapsDiffuse_SG
link = SrcmapsDiffuse_SG()
link.update_args(dict(comp='config/binning.yaml',
                      data='config/dataset_source.yaml',
                      library='models/library.yaml'))
link.run()

Top Level Configuration

We use a yaml file to define the top-level analysis parameters.

Sample top level Configuration
 # The binning components
 comp : config/binning.yaml
 # The dataset
 data : config/dataset_source.yaml
 # Library with the fitting components
 library : models/library.yaml
 # Yaml file with the list of models to prepare
 models : models/modellist.yaml
 # Input FT1 file
 ft1file : P8_P305_8years_source_zmax105.lst
 # HEALPix order for counts cubes
 hpx_order_ccube : 9
 # HEALPix order for exposure cubes
 hpx_order_expcube : 6
 # HEALPix order for fitting models
 hpx_order_fitting : 7
 # Build the XML files for the diffuse emission model components
 make_diffuse_comp_xml : True
 # Build the XML files for the catalog source model components
 make_catalog_comp_xml : True
 # Name of the directory for the merged GALProp gasmaps
 merged_gasmap_dir : merged_gasmap
 # Number of catalog sources per batch job
 catalog_nsrc : 500
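
The top-level file is ordinary YAML, so a driver script can read it with any YAML parser. A minimal sketch (the file name config/config.yaml is an assumption for illustration):

import yaml

# Read the top-level configuration and look up a few entries.
with open('config/config.yaml') as fin:
    top_config = yaml.safe_load(fin)

print(top_config['comp'])             # config/binning.yaml
print(top_config['hpx_order_ccube'])  # 9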

Binning Configuration

We use a yaml file to define the binning components.

Sample binning Configuration
coordsys : 'GAL'
E0:
    log_emin : 1.5
    log_emax : 2.0
    enumbins : 2
    zmax : 80.
    psf_types :
        PSF3 :
            hpx_order : 5
E1:
    log_emin : 2.0
    log_emax : 2.5
    enumbins : 2
    zmax : 90.
    psf_types :
        PSF23 :
            hpx_order : 6
E2:
    log_emin : 2.5
    log_emax : 3.0
    enumbins : 3
    zmax : 100.
    psf_types :
        PSF123 :
            hpx_order : 8
E3:
    log_emin : 3.0
    log_emax : 6.0
    enumbins : 9
    zmax : 105.
    psf_types :
        PSF0123 :
            hpx_order : 9
  • coordsys : ‘GAL’ or ‘CEL’
    Coordinate system to use
  • log_emin, log_emax: float
    Energy bin boundaries in log10(MeV)
  • enumbins: int
    Number of energy bins for this binning component
  • zmax : float
    Maximum zenith angle (in degrees) for this binning component
  • psf_types: dict
    Sub-dictionary of binning components by PSF event type, PSF3 means PSF event type 3 events only. PSF0123 means all four PSF event types.
  • hpx_order: int
    HEALPix order to use for binning. The more familiar nside parameter is nside = 2**order (see the sketch below).
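
As a quick illustration of how these parameters translate into concrete bins (a minimal sketch, not part of the pipeline; the values are taken from the ‘E3’ component in the sample above):

import numpy as np

# Binning parameters of the sample 'E3' component above.
log_emin, log_emax, enumbins = 3.0, 6.0, 9
hpx_order = 9

# Energy bin edges in MeV, evenly spaced in log10(E).
energy_edges = np.logspace(log_emin, log_emax, enumbins + 1)

# HEALPix resolution parameter corresponding to this order.
nside = 2 ** hpx_order

print(energy_edges[0], energy_edges[-1], nside)   # 1000.0 1000000.0 512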

Dataset Configuration

We use a yaml file to define the data set we are using. The example below specifies using a pre-defined 8 year dataset, selecting the “SOURCE” event class and using the V2 version of the corresponding IRFs (specifically P8R3_SOURCE_V2).

Sample dataset Configuration
 basedir : '/gpfs/slac/kipac/fs1/u/dmcat/data/flight/diffuse_dev'
 data_pass : 'P8'
 data_ver : 'P305'
 evclass : 'source'
 data_time : '8years'
 irf_ver : 'V2'

The basedir parameter should point at the analysis directory. For the most part the other parameters are used to construct the names of the various files produced by the pipeline. The evclass parameter defines the event selection, and the IRF version is defined by a combination of the data_ver, evclass and irf_ver parameters.

GALProp Rings Configuration

We use a yaml file to define how we combine the GALProp emission templates. The example below specifies how to construct a series of ‘merged_CO’ rings by combining GALProp intensity template predictions.

Sample GALProp rings Configuration
 galprop_run : 56_LRYusifovXCO5z6R30_QRPD_150_rc_Rs8
 ring_limits : [1, 2, 3, 4, 6, 8, 9, 10, 11, 12, 13, 15]
 diffuse_comp_dict :
     merged_CO : ['pi0_decay_H2R', 'bremss_H2R']
 remove_rings : ['merged_CO_7']
  • galprop_run : string
    Define the GALProp run to use for this component. This is used to make the filenames for input template maps.
  • ring_limits : list of int
    This specifies how to combine the GALProp rings into a smaller set of rings (see the sketch below).
  • diffuse_comp_dict : dict
    This specifies how to make GALProp components into merged components for the diffuse analysis.
  • remove_rings: list of str
    This allows us to remove certain rings from the model.
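
A rough sketch of how these settings could be interpreted (the half-open ring ranges and the merged-ring naming used here are assumptions for illustration, not the package's exact implementation):

# Inputs taken from the sample configuration above.
ring_limits = [1, 2, 3, 4, 6, 8, 9, 10, 11, 12, 13, 15]
diffuse_comp_dict = {'merged_CO': ['pi0_decay_H2R', 'bremss_H2R']}
remove_rings = ['merged_CO_7']

merged = {}
for name, galprop_comps in diffuse_comp_dict.items():
    # Assume each pair of consecutive limits defines a half-open range of
    # GALProp ring indices that are summed into a single merged ring.
    for i, (lo, hi) in enumerate(zip(ring_limits[:-1], ring_limits[1:])):
        key = '%s_%i' % (name, i)
        if key in remove_rings:
            continue  # rings listed in remove_rings are dropped from the model
        merged[key] = [(comp, ring) for comp in galprop_comps
                       for ring in range(lo, hi)]

print(sorted(merged))   # 10 merged_CO_* entries, with merged_CO_7 removed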

Catalog Component Configuration

We use a yaml file to define how we split up the catalog source components. The example below specifies using the FL8Y source list, splitting out the faint sources (i.e., those with a Signif_Avg value less than 100.) and the extended sources, and keeping all the remaining sources (i.e., the bright, point-like sources) as individual sources. A sketch of how such a selection could be applied follows the example.

Sample catalog component Configuration
 catalog_name : FL8Y
 catalog_file : /nfs/slac/kipac/fs1/u/dmcat/ancil/catalogs/official/4FGLp/gll_psc_8year_v4.fit
 catalog_extdir : /nfs/slac/kipac/fs1/u/dmcat/ancil/catalogs/official/extended/Extended_archive_v18
 catalog_type : FL8Y
 rules_dict :
     faint :
         cuts :
             - { cut_var: Signif_Avg, max_val : 100. }
             - mask_extended
     extended :
         cuts :
             - select_extended
     remainder :
         merge : False
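
A minimal sketch of how the ‘faint’ rule above could be evaluated against the catalog table (the extended-source column name and the selection logic are assumptions for illustration; the package's own selection code may differ):

import numpy as np
from astropy.table import Table

# Load the catalog table (path shortened from the sample configuration above).
cat = Table.read('gll_psc_8year_v4.fit')

# 'Signif_Avg' comes from the cut definition above; a non-empty
# 'Extended_Source_Name' column is assumed to mark extended sources.
signif = np.asarray(cat['Signif_Avg'])
is_extended = np.array([len(str(name).strip()) > 0
                        for name in cat['Extended_Source_Name']])

# 'faint' rule: Signif_Avg below max_val, with extended sources masked out.
faint = (signif < 100.) & ~is_extended
print('faint sources:', faint.sum())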

Model Component Library

We use a yaml file to define a “library” of model components. This comprises a set of named emission components, with one or more versions for each named component. Here is an example library definition file.

Sample Model Component Library Configuration
 # Catalog Components
 FL8Y :
     model_type : catalog
     versions : [v00]
 # Diffuse Components
 galprop_rings :
     model_type : galprop_rings
     versions : [p8-ref_IC_thin, p8-ref_HI_150, p8-ref_CO_300_mom, p8-ref_dnm_300hp]
 dnm_merged :
     model_type : MapCubeSource
     versions : ['like_4y_300K']
 gll-iem :
     model_type : MapCubeSource
     versions : [v06]
 loopI :
     model_type : MapCubeSource
     versions : [haslam]
 bubbles :
     model_type : MapCubeSource
     versions : [v00, v01]
 iso_map :
     model_type : MapCubeSource
     versions : [P8R3_SOURCE_V2]
 patches :
     model_type : MapCubeSource
     versions : [v09]
     selection_dependent : True
     no_psf : True
     edisp_disable : True
 unresolved :
     model_type : MapCubeSource
     versions : [strong]
 sun-ic :
     model_type : MapCubeSource
     versions : [v2r0, P8R3-v2r0]
     moving : True
     edisp_disable : True
 sun-disk :
     model_type : MapCubeSource
     versions : [v3r1, P8R3-v3r1]
     moving : True
     edisp_disable : True
 moon :
     model_type : MapCubeSource
     versions : [v3r2, P8R3-v3r2]
     moving : True
     edisp_disable : True
  • model_type: ‘MapCubeSource’ or ‘catalog’ or ‘galprop_rings’ or ‘SpatialMap’
    Specifies how this model should be constructed. See more below under the versions parameter.
  • versions: list of str
    Specifies different versions of this model component. How this string is used depends on the model type. For ‘MapCubeSource’ and ‘SpatialMap’ sources it is used to construct the expected filename for the intensity template (see the sketch below). For ‘catalog’ and ‘galprop_rings’ it is used to construct the filename for the yaml file that defines the sub-components for that component.
  • moving: bool
    If true, source-specific livetime cubes will be used to construct source templates for each zenith angle cut.
  • selection_dependent : bool
    If true, different source templates will be used for each binning component.
  • no_psf : bool
    Turns off PSF convolution for this source. Useful for data-driven components.
  • edisp_disable : bool
    Turns off energy dispersion for the source. Useful for data-driven components.
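
As a rough illustration of how a version string can turn into a template filename for a ‘MapCubeSource’ component (a sketch built from the sourcekey and template naming patterns documented for NameFactory below; the exact assembly inside the package is an assumption):

# Naming patterns documented for NameFactory later on this page.
sourcekey_format = '{source_name}_{source_ver}'
diffuse_template_format = 'templates/template_{sourcekey}.fits'

# e.g., the 'loopI' component with version 'haslam' from the sample library.
sourcekey = sourcekey_format.format(source_name='loopI', source_ver='haslam')
template = diffuse_template_format.format(sourcekey=sourcekey)
print(template)   # templates/template_loopI_haslam.fits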

Spectral Model Configuration

We use a yaml file to define the spectral models and default parameters. This file is simply a dictionary mapping names to sub-dictionaries defining spectral models and default model parameters.
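
No sample is shown here, but a minimal sketch of what such a file could contain, loaded through the SpectralLibrary helper documented below, looks like this (the entry name, spectrum type and parameter values are purely illustrative assumptions):

from fermipy.diffuse.spectral import SpectralLibrary

# Purely illustrative yaml text; the real file defines the spectrum types
# referenced by the model definitions (e.g., Constant_Correction, BinByBin_5).
SPECTRAL_YAML = """
Constant_Correction :
    SpectrumType : ConstantValue
    Value :
        value : 1.0
        min : 0.1
        max : 10.0
        free : True
"""

spectral_library = SpectralLibrary.create_from_yamlstr(SPECTRAL_YAML)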

Model Definition

We use a yaml file to define each overall model, which combines library components with spectral models.

Sample Model Definition Configuration
 library : models/library.yaml
 spectral_models : models/spectral_models.yaml
 sources :
     galprop_rings_p8-ref_IC_thin :
         model_type : galprop_rings
         version : p8-ref_IC_150
         SpectrumType :
             default : Constant_Correction
     galprop_rings-p8-ref_HI_300:
         model_type : galprop_rings
         version : p8-ref_HI_300
         SpectrumType :
             default : Powerlaw_Correction
             merged_HI_2_p8-ref_HI_300 : BinByBin_5
             merged_HI_3_p8-ref_HI_300 : BinByBin_5
             merged_HI_4_p8-ref_HI_300 : BinByBin_9
             merged_HI_5_p8-ref_HI_300 : BinByBin_9
             merged_HI_6_p8-ref_HI_300 : BinByBin_9
             merged_HI_8_p8-ref_HI_300 : BinByBin_5
             merged_HI_9_p8-ref_HI_300 : BinByBin_5
     galprop_rings-p8-ref_CO_300:
         model_type : galprop_rings
         version : p8-ref_CO_300_mom
         SpectrumType :
             default : Powerlaw_Correction
             merged_CO_2_p8-ref_CO_300_mom : BinByBin_5
             merged_CO_3_p8-ref_CO_300_mom : BinByBin_5
             merged_CO_4_p8-ref_CO_300_mom : BinByBin_9
             merged_CO_5_p8-ref_CO_300_mom : BinByBin_9
             merged_CO_6_p8-ref_CO_300_mom : BinByBin_9
             merged_CO_8_p8-ref_CO_300_mom : BinByBin_5
             merged_CO_9_p8-ref_CO_300_mom : BinByBin_5
     dnm_merged :
         version : like_4y_300K
         SpectrumType : BinByBin_5
     iso_map :
         version : P8R3_SOURCE_V2
         SpectrumType : Iso
     sun-disk :
         version : v3r1
         SpectrumType : Constant_Correction
         edisp_disable : True
     sun-ic :
         version : v2r0
         SpectrumType : Constant_Correction
         edisp_disable : True
     moon :
         version : v3r2
         SpectrumType : Constant_Correction
         edisp_disable : True
     patches :
         version : v09
         SpectrumType : Patches
         edisp_disable : True
     unresolved :
         version : strong
         SpectrumType : Constant_Correction
     FL8Y :
         model_type : Catalog
         version : v00
         SpectrumType :
             default : Constant_Correction
             FL8Y_v00_remain : Catalog
             FL8Y_v00_faint : BinByBin_9
  • model_type: ‘MapCubeSource’ or ‘catalog’ or ‘galprop_rings’ or ‘SpatialMap’
    Specifies how this model should be constructed. See more below.
  • version: str
    Specifies version of this model component.
  • edisp_disable : bool
    Turns off energy dispersion for the source. Useful for data-driven components. Needed for model XML file construction.
  • SpectrumType: str or dictionary
    This specifies the spectrum type to use for this model component. For ‘catalog’ and ‘galprop_rings’ model types it can be a dictionary mapping model sub-components to spectrum types. Note that the spectrum types should be defined in the spectral model configuration described above.

Examples

Module contents

Configuration, binning, default options, etc…

Small helper class to represent the binning used for a single component of a summed likelihood in diffuse analysis

class fermipy.diffuse.binning.Component(**kwargs)[source]

Bases: object

Small helper class to represent the binning used for a single component of a summed likelihood in diffuse analysis

Parameters:
  • log_emin (float) – Log base 10 of minimum energy for this component
  • log_emax (float) – Log base 10 of maximum energy for this component
  • enumbins (int) – Number of energy bins for this component
  • zmax (float) – Maximum zenith angle cut for this component in degrees
  • mktimefilters (list) – Filters for gtmktime.
  • hpx_order (int) – HEALPix order to use for this component
  • coordsys (str) – Coordinate system, ‘CEL’ or ‘GAL’
classmethod build_from_energy_dict(ebin_name, input_dict)[source]

Build a list of components from a dictionary for a single energy range

classmethod build_from_yamlfile(yamlfile)[source]

Build a list of components from a yaml file

classmethod build_from_yamlstr(yamlstr)[source]

Build a list of components from a yaml string

emax

Maximum energy for this component

emin

Minimum energy for this component

evtype

Event type bit mask for this component

make_key(format_str)[source]

Make a key to identify this component

format_str is formatted using object __dict__
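
A minimal usage sketch (the format string below, and the assumption that constructor parameters such as zmax and hpx_order end up in the object's __dict__, are illustrative):

from fermipy.diffuse.binning import Component

# Build the list of binning components from the sample binning yaml above.
components = Component.build_from_yamlfile('config/binning.yaml')

# make_key formats the given string with the object's __dict__; zmax and
# hpx_order are documented constructor parameters, assumed to be attributes.
comp = components[0]
print(comp.make_key('zmax{zmax}_order{hpx_order}'))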

Analysis framework for all-sky diffuse emission fitting

Handle the naming conventions for composite likelihood analysis

class fermipy.diffuse.name_policy.NameFactory(**kwargs)[source]

Bases: object

Helper class to define file names and keys consistently.

angprofile(**kwargs)[source]

return the file name for sun or moon angular profiles

angprofile_format = 'templates/profile_{sourcekey}.fits'
bexpcube(**kwargs)[source]

return the name of a binned exposure cube file

bexpcube_format = 'bexp_cubes/bexcube_{dataset}_{mktime}_{component}_{coordsys}_{irf_ver}.fits'
bexpcube_moon(**kwargs)[source]

return the name of a binned exposure cube file

bexpcube_sun(**kwargs)[source]

return the name of a binned exposure cube file

bexpcubemoon_format = 'bexp_cubes/bexcube_{dataset}_{mktime}_{component}_{irf_ver}_moon.fits'
bexpcubesun_format = 'bexp_cubes/bexcube_{dataset}_{mktime}_{component}_{irf_ver}_sun.fits'
catalog_split_yaml(**kwargs)[source]

return the name of a catalog split yaml file

catalog_split_yaml_format = 'models/catalog_{sourcekey}.yaml'
ccube(**kwargs)[source]

return the name of a counts cube file

ccube_format = 'counts_cubes/ccube_{dataset}_{mktime}_{component}_{coordsys}.fits'
comp_srcmdl_xml(**kwargs)[source]

return the name of a source model file

comp_srcmdl_xml_format = 'analysis/model_{modelkey}/srcmdl_{modelkey}_{component}.xml'
component(**kwargs)[source]

Return a key that specifies the data sub-selection

component_format = '{zcut}_{ebin}_{psftype}'
dataset(**kwargs)[source]

Return a key that specifies the data selection

dataset_format = '{data_pass}_{data_ver}_{data_time}_{evclass}'
diffuse_template(**kwargs)[source]

return the file name for other diffuse map templates

diffuse_template_format = 'templates/template_{sourcekey}.fits'
evclassmask(evclass_str)[source]

Get the bitmask for a particular event class

ft1file(**kwargs)[source]

return the name of the input ft1 file list

ft1file_format = '{dataset}_{zcut}.lst'
ft2file(**kwargs)[source]

return the name of the input ft2 file list

ft2file_format = 'ft2_files/ft2_{data_time}.lst'
fullpath(**kwargs)[source]

Return a full path name for a given file

fullpath_format = '{basedir}/{localpath}'
galprop_gasmap(**kwargs)[source]

return the file name for Galprop input gasmaps

galprop_gasmap_format = 'gasmap/{sourcekey}_{projtype}_{galprop_run}.gz'
galprop_ringkey(**kwargs)[source]

return the sourcekey for galprop input maps : specifies the component and ring

galprop_ringkey_format = '{source_name}_{ringkey}'
galprop_rings_yaml(**kwargs)[source]

return the name of a galprop rings merging yaml file

galprop_rings_yaml_format = 'models/galprop_rings_{galkey}.yaml'
galprop_sourcekey(**kwargs)[source]

return the sourcekey for merged galprop maps : specifies the merged component and merging scheme

galprop_sourcekey_format = '{source_name}_{galpropkey}'
generic(input_string, **kwargs)[source]

return a generic filename for a given dataset and component

irf_ver(**kwargs)[source]

Get the name of the IRF version

irfs(**kwargs)[source]

Get the name of the IRFs associated with a particular dataset

ltcube(**kwargs)[source]

return the name of a livetime cube file

ltcube_format = 'lt_cubes/ltcube_{data_time}_{mktime}_{zcut}.fits'
ltcube_moon(**kwargs)[source]

return the name of a livetime cube file

ltcube_sun(**kwargs)[source]

return the name of a livetime cube file

ltcubemoon_format = 'sunmoon/ltcube_{data_time}_{mktime}_{zcut}_moon.fits'
ltcubesun_format = 'sunmoon/ltcube_{data_time}_{mktime}_{zcut}_sun.fits'
make_filenames(**kwargs)[source]

Make a dictionary of filenames for various types

master_srcmdl_xml(**kwargs)[source]

return the name of a source model file

master_srcmdl_xml_format = 'analysis/model_{modelkey}/srcmdl_{modelkey}_master.xml'
mcube(**kwargs)[source]

return the name of a model cube file

mcube_format = 'model_cubes/mcube_{sourcekey}_{dataset}_{mktime}_{component}_{coordsys}_{irf_ver}.fits'
merged_gasmap(**kwargs)[source]

return the file name for Galprop merged gasmaps

merged_gasmap_format = 'merged_gasmaps/{sourcekey}_{projtype}.fits'
merged_sourcekey(**kwargs)[source]

return the sourcekey for merged sets of point sources : specifies the catalog and merging rule

merged_sourcekey_format = '{catalog}_{rulekey}'
merged_srcmaps(**kwargs)[source]

return the name of a source map file

merged_srcmaps_format = 'analysis/model_{modelkey}/srcmaps_{dataset}_{mktime}_{component}_{coordsys}_{irf_ver}.fits'
mktime(**kwargs)[source]

return the name of a selected events ft1file

mktime_format = 'counts_cubes/mktime_{dataset}_{mktime}_{component}.fits'
model_yaml(**kwargs)[source]

return the name of a model yaml file

model_yaml_format = 'models/model_{modelkey}.yaml'
nested_srcmdl_xml(**kwargs)[source]

return the file name for source model xml files of nested sources

nested_srcmdl_xml_format = 'srcmdls/{sourcekey}_sources.xml'
residual_cr(**kwargs)[source]

Return the name of the residual CR analysis output files

residual_cr_format = 'residual_cr/residual_cr_{dataset}_{mktime}_{component}_{coordsys}_{irf_ver}.fits'
select(**kwargs)[source]

return the name of a selected events ft1file

select_format = 'counts_cubes/select_{dataset}_{component}.fits'
sourcekey(**kwargs)[source]

Return a key that specifies the name and version of a source or component

sourcekey_format = '{source_name}_{source_ver}'
spectral_template(**kwargs)[source]

return the file name for spectral templates

spectral_template_format = 'templates/spectral_{sourcekey}.txt'
srcmaps(**kwargs)[source]

return the name of a source map file

srcmaps_format = 'srcmaps/srcmaps_{sourcekey}_{dataset}_{mktime}_{component}_{coordsys}_{irf_ver}.fits'
srcmdl_xml(**kwargs)[source]

return the file name for source model xml files

srcmdl_xml_format = 'srcmdls/{sourcekey}.xml'
stamp(**kwargs)[source]

Return the path for a stamp file for a scatter gather job

stamp_format = 'stamps/{linkname}.stamp'
template_sunmoon(**kwargs)[source]

return the file name for sun or moon template files

templatesunmoon_format = 'templates/template_{sourcekey}_{zcut}.fits'
update_base_dict(yamlfile)[source]

Update the values in the baseline dictionary used to resolve names
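
A rough sketch of how the naming templates compose (the *_format class attributes documented above are ordinary Python format strings, so a file name can be previewed by filling one in directly; the mktime and component values below are illustrative assumptions):

from fermipy.diffuse.name_policy import NameFactory

dataset = NameFactory.dataset_format.format(data_pass='P8', data_ver='P305',
                                            data_time='8years', evclass='source')
print(NameFactory.ccube_format.format(dataset=dataset, mktime='none',
                                      component='zmax105_E3_PSF0123',
                                      coordsys='GAL'))
# counts_cubes/ccube_P8_P305_8years_source_none_zmax105_E3_PSF0123_GAL.fits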

Utilities and tools

Classes and utilities that manage spectral model specific to diffuse analyses

class fermipy.diffuse.spectral.SpectralLibrary(spectral_dict)[source]

Bases: object

A small helper class that serves as an alias dictionary for spectral models

classmethod create_from_yaml(yamlfile)[source]

Create the dictionary for a yaml file

classmethod create_from_yamlstr(yamlstr)[source]

Create the dictionary from a yaml string

update(spectral_dict)[source]

Update the dictionary

Small helper class to represent the selection of mktime filters used in the analysis

class fermipy.diffuse.timefilter.MktimeFilterDict(aliases, selections)[source]

Bases: object

Small helper class to keep track of the selection of mktime filters used in the analysis

static build_from_yamlfile(yamlfile)[source]

Build a list of components from a yaml file

items()[source]

Return the iterator over key, value pairs

keys()[source]

Return the iterator over keys

values()[source]

Return the iterator over values

Classes and utilities that create fermipy source objects

class fermipy.diffuse.source_factory.SourceFactory[source]

Bases: object

Small helper class to build and keep track of sources

add_sources(source_info_dict)[source]

Add all of the sources in source_info_dict to this factory

static build_catalog(**kwargs)[source]

Build a fermipy.catalog.Catalog object

Parameters:
  • catalog_type (str) – Specifies catalog type, options include 2FHL | 3FGL | 4FGLP
  • catalog_file (str) – FITS file with catalog tables
  • catalog_extdir (str) – Path to directory with extended source templates
classmethod copy_selected_sources(roi, source_names)[source]

Build and return a fermipy.roi_model.ROIModel object by copying selected sources from another such object

static make_fermipy_roi_model_from_catalogs(cataloglist)[source]

Build and return a fermipy.roi_model.ROIModel object from a list of fermipy.catalog.Catalog objects

classmethod make_roi(sources=None)[source]

Build and return a fermipy.roi_model.ROIModel object from a dict with information about the sources

source_info_dict

Return the dictionary of source_info objects used to build sources

sources

Return the dictionary of sources

fermipy.diffuse.source_factory.make_catalog_sources(catalog_roi_model, source_names)[source]

Construct and return dictionary of sources that are a subset of sources in catalog_roi_model.

Parameters:
  • catalog_roi_model (dict or fermipy.roi_model.ROIModel) – Input set of sources
  • source_names (list) – Names of sources to extract
Returns: dict mapping source_name to fermipy.roi_model.Source objects
fermipy.diffuse.source_factory.make_composite_source(name, spectrum)[source]

Construct and return a fermipy.roi_model.CompositeSource object

fermipy.diffuse.source_factory.make_isotropic_source(name, Spectrum_Filename, spectrum)[source]

Construct and return a fermipy.roi_model.IsoSource object

fermipy.diffuse.source_factory.make_mapcube_source(name, Spatial_Filename, spectrum)[source]

Construct and return a fermipy.roi_model.MapCubeSource object

fermipy.diffuse.source_factory.make_point_source(name, src_dict)[source]

Construct and return a fermipy.roi_model.Source object

fermipy.diffuse.source_factory.make_sources(comp_key, comp_dict)[source]

Make dictionary mapping component keys to a source or set of sources

Parameters:
  • comp_key (str) – Key used to access sources
  • comp_dict (dict) – Information used to build sources
Returns: OrderedDict mapping comp_key to fermipy.roi_model.Source
fermipy.diffuse.source_factory.make_spatialmap_source(name, Spatial_Filename, spectrum)[source]

Construct and return a fermipy.roi_model.Source object

Prepare data for diffuse all-sky analysis

fermipy.diffuse.utils.create_inputlist(arglist)[source]

Read lines from a file and make a list of file names.

Removes whitespace and lines that start with ‘#’. Recursively reads all files with the extension ‘.lst’.

fermipy.diffuse.utils.readlines(arg)[source]

Read lines from a file into a list.

Removes whitespace and lines that start with ‘#’
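
A short usage sketch (assuming create_inputlist accepts the path of a .lst file; the file name is taken from the sample top-level configuration above):

from fermipy.diffuse.utils import create_inputlist

# Expand the list of monthly FT1 files referenced by the top-level config;
# nested '.lst' files are read recursively, comments and blank lines dropped.
ft1_files = create_inputlist('P8_P305_8years_source_zmax105.lst')
print(len(ft1_files))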

Helper classes to manage model building

class fermipy.diffuse.model_component.ModelComponentInfo(**kwargs)[source]

Bases: object

Information about a model component

Parameters:
  • source_name (str) – The name given to the component, e.g., loop_I or moon
  • source_ver (str) – Key to identify the model version of the source, e.g., v00
  • sourcekey (str) – Key that identifies this component, e.g., loop_I_v00 or moon_v00
  • model_type (str) – Type of model, ‘MapCubeSource’ | ‘IsoSource’ | ‘CompositeSource’ | ‘Catalog’ | ‘PointSource’
  • srcmdl_name (str) – Name of the xml file with the xml just for this component
  • moving (bool) – Flag for moving sources (i.e., the sun and moon)
  • selection_dependent (bool) – Flag for selection dependent sources (i.e., the residual cosmic ray model)
  • no_psf (bool) – Flag to indicate that we do not smear this component with the PSF
  • components (dict) – Sub-dictionary of ModelComponentInfo objects for moving and selection_dependent sources
  • comp_key (str) – Component key for this component of moving and selection_dependent sources
add_component_info(compinfo)[source]

Add sub-component specific information to a particular data selection

Parameters: compinfo (ModelComponentInfo object) – Sub-component being added
clone_and_merge_sub(key)[source]

Clones self and merges clone with sub-component specific information

Parameters:
  • key (str) – Key specifying which sub-component
Returns: ModelComponentInfo object
get_component_info(comp)[source]

Return the information about sub-component specific to a particular data selection

Parameters:
  • comp (binning.Component object) – Specifies the sub-component
Returns: ModelComponentInfo object
update(**kwargs)[source]

Update data members from keyword arguments

class fermipy.diffuse.model_component.CatalogInfo(**kwargs)[source]

Bases: object

Information about a source catalog

Parameters:
  • catalog_name (str) – The name given to the catalog, e.g., 3FGL or FL8Y
  • catalog_file (str) – Fits file with catalog data
  • catalog_extdir (str) – Directory with extended source templates
  • catalog_type (str) – Identifies the format of the catalog fits file: e.g., ‘3FGL’ or ‘4FGLP’
  • catalog (fermipy.catalog.Catalog) – Catalog object
  • roi_model (fermipy.roi_model.ROIModel) – Fermipy object describing all the catalog sources
  • srcmdl_name (str) – Name of xml file with the catalog source model
update(**kwargs)[source]

Update data members from keyword arguments

class fermipy.diffuse.model_component.GalpropMergedRingInfo(**kwargs)[source]

Bases: object

Information about a set of Merged Galprop Rings

Parameters:
  • source_name (str) – The name given to the merged component, e.g., merged_CO or merged_HI
  • ring (int) – The index of the merged ring
  • sourcekey (str) – Key that identifies this component, e.g., merged_CO_1, or merged_HI_3
  • galkey (str) – Key that identifies how to merge the galprop rings, e.g., ‘ref’
  • galprop_run (str) – Key that identifies the galprop run used to make the input rings
  • files (list) – List of the input gasmap files
  • merged_gasmap (str) – Filename for the merged gasmap
update(**kwargs)[source]

Update data members from keyword arguments

class fermipy.diffuse.model_component.IsoComponentInfo(**kwargs)[source]

Bases: fermipy.diffuse.model_component.ModelComponentInfo

Information about a model component represented by a IsoSource

Parameters: Spectral_Filename (str) – Name of the template file for the spectral model
class fermipy.diffuse.model_component.PointSourceInfo(**kwargs)[source]

Bases: fermipy.diffuse.model_component.ModelComponentInfo

Information about a model component represented by a PointSource

class fermipy.diffuse.model_component.CompositeSourceInfo(**kwargs)[source]

Bases: fermipy.diffuse.model_component.ModelComponentInfo

Information about a model component represented by a CompositeSource

Parameters:
  • source_names (list) – The names of the nested sources
  • catalog_info (model_component.CatalogInfo or None) – Information about the catalog containing the nested sources
  • roi_model (fermipy.roi_model.ROIModel) – Fermipy object describing the nested sources
class fermipy.diffuse.model_component.CatalogSourcesInfo(**kwargs)[source]

Bases: fermipy.diffuse.model_component.ModelComponentInfo

Information about a model component consisting of sources from a catalog

Parameters:
  • source_names (list) – The names of the nested sources
  • catalog_info (model_component.CatalogInfo or None) – Information about the catalog containing the nested sources
  • roi_model (fermipy.roi_model.ROIModel) – Fermipy object describing the nested sources
class fermipy.diffuse.diffuse_src_manager.GalpropMapManager(**kwargs)[source]

Bases: object

Small helper class to keep track of Galprop gasmaps

This keeps track of two types of dictionaries. Both are keyed by: key = {source_name}_{ring}_{galkey}

Where: {source_name} is something like ‘merged_CO’, {ring} is the ring index, and {galkey} is a key specifying which version of galprop rings to use.

The two dictionaries are: ring_dict[key] = model_component.GalpropMergedRingInfo and diffuse_comp_info_dict[key] = model_component.ModelComponentInfo

The dictionaries are defined in files called models/galprop_rings_{galkey}.yaml

diffuse_comp_info_dicts(galkey)[source]

Return the components info dictionary for a particular galprop key

galkeys()[source]

Return the list of galprop keys used

make_diffuse_comp_info(merged_name, galkey)[source]

Make the information about a single merged component

Parameters:
  • merged_name (str) – The name of the merged component
  • galkey (str) – A short key identifying the galprop parameters
Returns: model_component.ModelComponentInfo
make_diffuse_comp_info_dict(galkey)[source]

Make a dictionary mapping from merged component to information about that component

Parameters: galkey (str) – A short key identifying the galprop parameters
make_merged_name(source_name, galkey, fullpath)[source]

Make the name of a gasmap file for a set of merged rings

Parameters:
  • source_name (str) – The galprop component, used to define path to gasmap files
  • galkey (str) – A short key identifying the galprop parameters
  • fullpath (bool) – Return the full path name
make_ring_dict(galkey)[source]

Make a dictionary mapping the merged component names to list of template files

Parameters:
  • galkey (str) – Unique key for this ring dictionary
Returns: model_component.GalpropMergedRingInfo
make_ring_filelist(sourcekeys, rings, galprop_run)[source]

Make a list of all the template files for a merged component

Parameters:
  • sourcekeys (list-like of str) – The names of the components to merge
  • rings (list-like of int) – The indices of the rings to merge
  • galprop_run (str) – String identifying the galprop parameters
make_ring_filename(source_name, ring, galprop_run)[source]

Make the name of a gasmap file for a single ring

Parameters:
  • source_name (str) – The galprop component, used to define path to gasmap files
  • ring (int) – The ring index
  • galprop_run (str) – String identifying the galprop parameters
make_xml_name(source_name, galkey, fullpath)[source]

Make the name of an xml file for a model definition for a set of merged rings

Parameters:
  • source_name (str) – The galprop component, used to define path to gasmap files
  • galkey (str) – A short key identifying the galprop parameters
  • fullpath (bool) – Return the full path name
merged_components(galkey)[source]

Return the set of merged components for a particular galprop key

read_galprop_rings_yaml(galkey)[source]

Read the yaml file for a particular galprop key

ring_dict(galkey)[source]

Return the ring dictionary for a particular galprop key

class fermipy.diffuse.diffuse_src_manager.DiffuseModelManager(**kwargs)[source]

Bases: object

Small helper class to keep track of diffuse component templates

This keeps track of the ‘diffuse component information’ dictionary.

This is keyed by: key = {source_name}_{source_ver}, where {source_name} is something like ‘loopI’ and {source_ver} is something like ‘v00’.

The dictionary is diffuse_comp_info_dict[key] -> model_component.ModelComponentInfo

Note that some components (those that represent moving sources or are selection dependent) will have a sub-dictionary of diffuse_comp_info_dict objects, one for each sub-component.

The components are defined in a file called config/diffuse_components.yaml

diffuse_comp_info(sourcekey)[source]

Return the Component info associated to a particular key

make_diffuse_comp_info(source_name, source_ver, diffuse_dict, components=None, comp_key=None)[source]

Make a dictionary mapping the merged component names to list of template files

Parameters:
  • source_name (str) – Name of the source
  • source_ver (str) – Key identifying the version of the source
  • diffuse_dict (dict) – Information about this component
  • comp_key (str) – Used when we need to keep track of sub-components, i.e., for moving and selection dependent sources.
Returns: model_component.ModelComponentInfo or model_component.IsoComponentInfo
make_diffuse_comp_info_dict(diffuse_sources, components)[source]

Make a dictionary mapping from diffuse component to information about that component

Parameters:
  • diffuse_sources (dict) – Dictionary with diffuse source definitions
  • components (dict) – Dictionary with event selection definitions, needed for selection dependent diffuse components
Returns:

ret_dict – Dictionary mapping sourcekey to model_component.ModelComponentInfo

Return type:

dict

make_template_name(model_type, sourcekey)[source]

Make the name of a template file for a particular component

Parameters:
  • model_type (str) – Type of model to use for this component
  • sourcekey (str) – Key to identify this component
Returns: filename, or None if the component does not require a template file
make_xml_name(sourcekey)[source]

Make the name of an xml file for a model definition of a single component

Parameters: sourcekey (str) – Key to identify this component
static read_diffuse_component_yaml(yamlfile)[source]

Read the yaml file for the diffuse components

sourcekeys()[source]

Return the list of source keys

class fermipy.diffuse.catalog_src_manager.CatalogSourceManager(**kwargs)[source]

Bases: object

Small helper class to keep track of how we deal with catalog sources

This keeps track of two dictionaries

One of the dictionaries is keyed by catalog name and contains information about complete catalogs: catalog_comp_info_dicts[catalog_name] : model_component.CatalogInfo

The other dictionary is keyed by [{catalog_name}_{split_ver}][{split_key}], where {catalog_name} is something like ‘3FGL’, {split_ver} is something like ‘v00’ and specifies how to divide the sources in the catalog, and {split_key} refers to a specific sub-selection of sources:

split_comp_info_dicts[splitkey] : model_component.ModelComponentInfo

build_catalog_info(catalog_info)[source]

Build a CatalogInfo object

catalog_comp_info_dict(catkey)[source]

Return the roi_model for an entire catalog

catalog_components(catalog_name, split_ver)[source]

Return the set of merged components for a particular split key

catalogs()[source]

Return the list of full catalogs used

make_catalog_comp_info(full_cat_info, split_key, rule_key, rule_val, sources)[source]

Make the information about a single merged component

Parameters:
  • full_cat_info (_model_component.CatalogInfo) – Information about the full catalog
  • split_key (str) – Key identifying the version of the splitting used
  • rule_key (str) – Key identifying the specific rule for this component
  • rule_val (list) – List of the cuts used to define this component
  • sources (list) – List of the names of the sources in this component
Returns: CompositeSourceInfo or CatalogSourcesInfo
make_catalog_comp_info_dict(catalog_sources)[source]

Make the information about the catalog components

Parameters: catalog_sources (dict) – Dictionary with catalog source definitions
Returns:
  • catalog_ret_dict (dict) – Dictionary mapping catalog_name to model_component.CatalogInfo
  • split_ret_dict (dict) – Dictionary mapping sourcekey to model_component.ModelComponentInfo
read_catalog_info_yaml(splitkey)[source]

Read the yaml file for a particular split key

split_comp_info(catalog_name, split_ver, split_key)[source]

Return the info for a particular split key

split_comp_info_dict(catalog_name, split_ver)[source]

Return the information about a particular scheme for how to handle catalog sources

splitkeys()[source]

Return the list of catalog split keys used

class fermipy.diffuse.model_manager.ModelComponent(**kwargs)[source]

Bases: object

Small helper class to tie a ModelComponentInfo to a spectrum

class fermipy.diffuse.model_manager.ModelInfo(**kwargs)[source]

Bases: object

Small helper class to keep track of a single fitting model

component_names

Return the list of names of the components

edisp_disable_list()[source]

Return the list of sources for which energy dispersion should be turned off

items()[source]

Return the key, value pairs of model components

make_model_rois(components, name_factory)[source]

Make the fermipy roi_model objects for each of a set of binning components

make_srcmap_manifest(components, name_factory)[source]

Build a yaml file that specifies how to make the srcmap files for a particular model

Parameters:
  • components (list) – The binning components used in this analysis
  • name_factory (NameFactory) – Object that handles naming conventions
Returns: a dictionary that contains information about where to find the source maps for each component of the model
class fermipy.diffuse.model_manager.ModelManager(**kwargs)[source]

Bases: object

Small helper class to create fitting models and manager XML files for fermipy

This class contains a ‘library’, which is a dictionary of all the source components:

Specifically, it maps:

sourcekey : model_component.ModelComponentInfo

csm

Return the CatalogSourceManager

dmm

Return the DiffuseModelManager

static get_sub_comp_info(source_info, comp)[source]

Build and return information about a sub-component for a particular selection

gmm

Return the GalpropMapManager

make_fermipy_config_yaml(modelkey, components, data, **kwargs)[source]

Build a fermipy top-level yaml configuration file

Parameters:
  • modelkey (str) – Key used to identify this particular model
  • components (list) – The binning components used in this analysis
  • data (str) – Path to file containing dataset definition
make_library(diffuse_yaml, catalog_yaml, binning_yaml)[source]

Build up the library of all the components

Parameters:
  • diffuse_yaml (str) – Name of the yaml file with the library of diffuse component definitions
  • catalog_yaml (str) – Name of the yaml file with the library of catalog split definitions
  • binning_yaml (str) – Name of the yaml file with the binning definitions
make_model_info(modelkey)[source]

Build a dictionary with the information for a particular model.

Parameters:
  • modelkey (str) – Key used to identify this particular model
Returns: ModelInfo
make_srcmap_manifest(modelkey, components, data)[source]

Build a yaml file that specifies how to make the srcmap files for a particular model

Parameters:
  • modelkey (str) – Key used to identify this particular model
  • components (list) – The binning components used in this analysis
  • data (str) – Path to file containing dataset definition
read_model_yaml(modelkey)[source]

Read the yaml file for a particular model
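
A rough usage sketch tying the manager methods together (the constructor keyword, the catalog split file name and the model key below are assumptions for illustration):

from fermipy.diffuse.model_manager import ModelManager

model_man = ModelManager(basedir='.')   # 'basedir' keyword is an assumption
model_man.make_library('models/library.yaml',
                       'config/catalog_components.yaml',   # hypothetical path
                       'config/binning.yaml')
model_info = model_man.make_model_info('baseline')   # 'baseline' is hypothetical
print(model_info.component_names)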

Batch job dispatch classes

class fermipy.diffuse.job_library.Gtexpcube2_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for Gtlink_expcube2

Parameters:
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • hpx_order_max (<class 'int'>) – Maximum HEALPIX order for exposure cubes. [6]
appname = 'fermipy-gtexcube2-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of Gtlink_expcube2

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'hpx_order_max': (6, 'Maximum HEALPIX order for exposure cubes.', <class 'int'>)}
description = 'Submit gtexpube2 jobs in parallel'
job_time = 300
usage = 'fermipy-gtexcube2-sg [options]'
class fermipy.diffuse.job_library.Gtltsum_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for Gtlink_ltsum

Parameters:
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • ft1file (<class 'str'>) – Input FT1 file [None]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
appname = 'fermipy-gtltsum-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of Gtlink_ltsum

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'ft1file': (None, 'Input FT1 file', <class 'str'>)}
description = 'Submit gtltsum jobs in parallel'
job_time = 300
usage = 'fermipy-gtltsum-sg [options]'
class fermipy.diffuse.solar.Gtexpcube2wcs_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for Gtlink_expcube2_wcs

Parameters:
  • mktimefilter (<class 'str'>) – Key for gtmktime selection [None]
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • binsz (<class 'float'>) – Image scale (in degrees/pixel) [1.0]
  • nypix (<class 'int'>) – Size of the Y axis in pixels [180]
  • nxpix (<class 'int'>) – Size of the X axis in pixels [360]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
appname = 'fermipy-gtexpcube2wcs-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of Gtlink_expcube2_wcs

default_options = {'binsz': (1.0, 'Image scale (in degrees/pixel)', <class 'float'>), 'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'mktimefilter': (None, 'Key for gtmktime selection', <class 'str'>), 'nxpix': (360, 'Size of the X axis in pixels', <class 'int'>), 'nypix': (180, 'Size of the Y axis in pixels', <class 'int'>)}
description = 'Submit gtexpcube2 jobs in parallel'
job_time = 300
usage = 'fermipy-gtexpcube2wcs-sg [options]'
class fermipy.diffuse.solar.Gtexphpsun_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for gtexphpsun

Parameters:
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • mktimefilter (<class 'str'>) – Key for gtmktime selection [None]
appname = 'fermipy-gtexphpsun-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of Gtlink_exphpsun

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'mktimefilter': (None, 'Key for gtmktime selection', <class 'str'>)}
description = 'Submit gtexphpsun jobs in parallel'
job_time = 300
usage = 'fermipy-gtexphpsun-sg [options]'
class fermipy.diffuse.solar.Gtsuntemp_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for gtsuntemp

Parameters:
  • sourcekeys (<class 'list'>) – Keys for sources to make template for [None]
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • mktimefilter (<class 'str'>) – Key for gtmktime selection [None]
appname = 'fermipy-gtsuntemp-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of Gtlink_suntemp

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'mktimefilter': (None, 'Key for gtmktime selection', <class 'str'>), 'sourcekeys': (None, 'Keys for sources to make template for', <class 'list'>)}
description = 'Submit gtsuntemp jobs in parallel'
job_time = 300
usage = 'fermipy-gtsuntemp-sg [options]'
class fermipy.diffuse.gt_coadd_split.CoaddSplit_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for fermipy-coadd

Parameters:
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • ft1file (<class 'str'>) – Input FT1 file [None]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
appname = 'fermipy-coadd-split-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of fermipy.diffuse.job_library.Link_FermipyCoadd

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'ft1file': (None, 'Input FT1 file', <class 'str'>)}
description = 'Submit fermipy-coadd-split- jobs in parallel'
job_time = 300
usage = 'fermipy-coadd-split-sg [options]'
class fermipy.diffuse.job_library.GatherSrcmaps_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for Link_FermipyGatherSrcmaps

Parameters:
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • library (<class 'str'>) – Path to yaml file defining model components. [models/library.yaml]
appname = 'fermipy-gather-srcmaps-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of Link_FermipyGatherSrcmaps

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'library': ('models/library.yaml', 'Path to yaml file defining model components.', <class 'str'>)}
description = 'Submit fermipy-gather-srcmaps jobs in parallel'
job_time = 300
usage = 'fermipy-gather-srcmaps-sg [options]'
class fermipy.diffuse.job_library.Vstack_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for Link_FermipyVstack to merge source maps

Parameters:
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • library (<class 'str'>) – Path to yaml file defining model components. [models/library.yaml]
appname = 'fermipy-vstack-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of Link_FermipyVstack

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'library': ('models/library.yaml', 'Path to yaml file defining model components.', <class 'str'>)}
description = 'Submit fermipy-vstack jobs in parallel'
job_time = 300
usage = 'fermipy-vstack-sg [options]'
class fermipy.diffuse.job_library.Healview_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for Link_FermipyHealview

Parameters:
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • library (<class 'str'>) – Path to yaml file defining model components. [models/library.yaml]
appname = 'fermipy-healviw-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of Link_FermipyHealview

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'library': ('models/library.yaml', 'Path to yaml file defining model components.', <class 'str'>)}
description = 'Submit fermipy-healviw jobs in parallel'
job_time = 60
usage = 'fermipy-healviw-sg [options]'
class fermipy.diffuse.job_library.SumRings_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for Link_FermipyCoadd to sum galprop ring gasmaps

Parameters:
  • library (<class 'str'>) – Path to yaml file defining model components. [models/library.yaml]
  • outdir (<class 'str'>) – Output directory [None]
appname = 'fermipy-sum-rings-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of Link_FermipyCoadd

default_options = {'library': ('models/library.yaml', 'Path to yaml file defining model components.', <class 'str'>), 'outdir': (None, 'Output directory', <class 'str'>)}
description = 'Submit fermipy-coadd jobs in parallel to sum GALProp rings'
job_time = 300
usage = 'fermipy-sum-rings-sg [options]'
class fermipy.diffuse.residual_cr.ResidualCR_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for this script

Parameters:
  • hpx_order (<class 'int'>) – HEALPIX order parameter [6]
  • clean (<class 'str'>) – Clean event class [ultracleanveto]
  • dirty (<class 'str'>) – Dirty event class [source]
  • full_output (<class 'bool'>) – Include diagnostic output [False]
  • mktimefilter (<class 'str'>) – Key for gtmktime selection [None]
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • select_factor (<class 'float'>) – Pixel selection factor for Aeff Correction [5.0]
  • mask_factor (<class 'float'>) – Pixel selection factor for output mask [2.0]
  • sigma (<class 'float'>) – Width of gaussian to smooth output maps [degrees] [3.0]
appname = 'fermipy-residual-cr-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of ResidualCR

default_options = {'clean': ('ultracleanveto', 'Clean event class', <class 'str'>), 'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'dirty': ('source', 'Dirty event class', <class 'str'>), 'full_output': (False, 'Include diagnostic output', <class 'bool'>), 'hpx_order': (6, 'HEALPIX order parameter', <class 'int'>), 'mask_factor': (2.0, 'Pixel selection factor for output mask', <class 'float'>), 'mktimefilter': (None, 'Key for gtmktime selection', <class 'str'>), 'select_factor': (5.0, 'Pixel selection factor for Aeff Correction', <class 'float'>), 'sigma': (3.0, 'Width of gaussian to smooth output maps [degrees]', <class 'float'>)}
description = 'Compute the residual cosmic-ray contamination'
job_time = 300
usage = 'fermipy-residual-cr-sg [options]'
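
Because default_options is an ordinary dict of (default, help, type) tuples, the documented defaults can be inspected programmatically before submitting any jobs; a minimal sketch:

    from fermipy.diffuse.residual_cr import ResidualCR_SG

    # List each documented option with its help string and default value.
    for name, (default, helptext, _otype) in sorted(ResidualCR_SG.default_options.items()):
        print("%-15s %-45s [%s]" % (name, helptext, default))
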
class fermipy.diffuse.gt_merge_srcmaps.MergeSrcmaps_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for GtMergeSrcmaps

Parameters:
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • library (<class 'str'>) – Path to yaml file defining model components. [models/library.yaml]
appname = 'fermipy-merge-srcmaps-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of GtMergeSrcmaps

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'library': ('models/library.yaml', 'Path to yaml file defining model components.', <class 'str'>)}
description = 'Merge diffuse maps for all-sky analysis'
job_time = 300
usage = 'fermipy-merge-srcmaps-sg [options]'
class fermipy.diffuse.gt_srcmap_partial.SrcmapsDiffuse_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for GtSrcmapsDiffuse

Parameters:
  • make_xml (<class 'bool'>) – Write xml files needed to make source maps [True]
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • library (<class 'str'>) – Path to yaml file defining model components. [models/library.yaml]
appname = 'fermipy-srcmaps-diffuse-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of GtSrcmapsDiffuse

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'library': ('models/library.yaml', 'Path to yaml file defining model components.', <class 'str'>), 'make_xml': (True, 'Write xml files needed to make source maps', <class 'bool'>)}
description = 'Run gtsrcmaps for diffuse sources'
job_time = 1500
usage = 'fermipy-srcmaps-diffuse-sg [options]'
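
A minimal sketch of submitting this step from Python, assuming the documented parameters map to command-line options of the same name and that the default YAML files are in place:

    import subprocess

    # Convolve each diffuse template with the IRFs for every binning
    # component and energy bin; the paths are the documented defaults.
    subprocess.run(
        ["fermipy-srcmaps-diffuse-sg",
         "--comp", "config/binning.yaml",
         "--data", "config/dataset_sourceveto.yaml",
         "--library", "models/library.yaml"],
        check=True)
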
class fermipy.diffuse.gt_srcmaps_catalog.SrcmapsCatalog_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for gtsrcmaps for catalog sources

Parameters:
  • make_xml (<class 'bool'>) – Make XML files. [True]
  • nsrc (<class 'int'>) – Number of sources per job [500]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • library (<class 'str'>) – Path to yaml file defining model components. [models/library.yaml]
appname = 'fermipy-srcmaps-catalog-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of GtSrcmapsCatalog

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'library': ('models/library.yaml', 'Path to yaml file defining model components.', <class 'str'>), 'make_xml': (True, 'Make XML files.', <class 'bool'>), 'nsrc': (500, 'Number of sources per job', <class 'int'>)}
description = 'Run gtsrcmaps for catalog sources'
job_time = 1500
usage = 'fermipy-srcmaps-catalog-sg [options]'
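
The nsrc parameter controls how the catalog is split across jobs; a minimal sketch of an invocation, again assuming the documented parameters map to same-named command-line options:

    import subprocess

    # Build source maps for the catalog sources, 500 sources per job
    # (the documented default for nsrc).
    subprocess.run(
        ["fermipy-srcmaps-catalog-sg",
         "--comp", "config/binning.yaml",
         "--data", "config/dataset_sourceveto.yaml",
         "--library", "models/library.yaml",
         "--nsrc", "500"],
        check=True)
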
class fermipy.diffuse.gt_assemble_model.AssembleModel_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for this script

Parameters:
  • --compname (binning component definition yaml file) –
  • --data (dataset definition yaml file) –
  • --models (model definition yaml file) –
  • args (Names of models to assemble source maps for) –
appname = 'fermipy-assemble-model-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of AssembleModel

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'hpx_order': (7, 'Maximum HEALPIX order for model fitting.', <class 'int'>), 'models': ('models/modellist.yaml', 'Path to yaml file defining models.', <class 'str'>)}
description = 'Copy source maps from the library to an analysis directory'
job_time = 300
usage = 'fermipy-assemble-model-sg [options]'
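
A minimal sketch of assembling the source maps for the models listed in a model-definition file, assuming the documented parameters appear as same-named command-line options and that models/modellist.yaml (the documented default) exists:

    import subprocess

    # Copy the source maps needed by each model into its analysis
    # directory, capping the HEALPix order at the documented default.
    subprocess.run(
        ["fermipy-assemble-model-sg",
         "--comp", "config/binning.yaml",
         "--data", "config/dataset_sourceveto.yaml",
         "--models", "models/modellist.yaml",
         "--hpx_order", "7"],
        check=True)
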
class fermipy.diffuse.gt_split_and_bin.SplitAndBin_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for SplitAndBin

Parameters:
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • ft1file (<class 'str'>) – Input FT1 file [None]
  • scratch (<class 'str'>) – Path to scratch area [None]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • hpx_order_max (<class 'int'>) – Maximum HEALPIX order for binning counts data. [9]
appname = 'fermipy-split-and-bin-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of SplitAndBin

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'ft1file': (None, 'Input FT1 file', <class 'str'>), 'hpx_order_max': (9, 'Maximum HEALPIX order for binning counts data.', <class 'int'>), 'scratch': (None, 'Path to scratch area', <class 'str'>)}
description = 'Prepare data for diffuse all-sky analysis'
job_time = 1500
usage = 'fermipy-split-and-bin-sg [options]'
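
A minimal sketch of launching the split-and-bin jobs; the FT1 input name is hypothetical and the option names are assumed to match the documented parameters:

    import subprocess

    # Split and bin the events in the (hypothetical) FT1 input, binning
    # counts up to the documented maximum HEALPix order of 9.
    subprocess.run(
        ["fermipy-split-and-bin-sg",
         "--comp", "config/binning.yaml",
         "--data", "config/dataset_sourceveto.yaml",
         "--ft1file", "ft1_input.lst",
         "--hpx_order_max", "9"],
        check=True)
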
class fermipy.diffuse.gt_split_and_mktime.SplitAndMktime_SG(link, **kwargs)[source]

Bases: fermipy.jobs.scatter_gather.ScatterGather

Small class to generate configurations for SplitAndMktime

Parameters:
  • do_ltsum (<class 'bool'>) – Run gtltsum on inputs [False]
  • ft1file (<class 'str'>) – Path to list of input FT1 files [P8_P305_8years_source_zmax105.lst]
  • hpx_order_max (<class 'int'>) – Maximum HEALPIX order for binning counts data. [9]
  • ft2file (<class 'str'>) – Path to list of input FT2 files [ft2_8years.lst]
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • dry_run (<class 'bool'>) – Print commands but do not run them [False]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • scratch (<class 'str'>) – Path to scratch area. [None]
appname = 'fermipy-split-and-mktime-sg'
build_job_configs(args)[source]

Hook to build job configurations

clientclass

alias of SplitAndMktime

default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'do_ltsum': (False, 'Run gtltsum on inputs', <class 'bool'>), 'dry_run': (False, 'Print commands but do not run them', <class 'bool'>), 'ft1file': ('P8_P305_8years_source_zmax105.lst', 'Path to list of input FT1 files', <class 'str'>), 'ft2file': ('ft2_8years.lst', 'Path to list of input FT2 files', <class 'str'>), 'hpx_order_max': (9, 'Maximum HEALPIX order for binning counts data.', <class 'int'>), 'scratch': (None, 'Path to scratch area.', <class 'str'>)}
description = 'Prepare data for diffuse all-sky analysis'
job_time = 1500
usage = 'fermipy-split-and-mktime-sg [options]'

Analysis chain classes

class fermipy.diffuse.gt_coadd_split.CoaddSplit(**kwargs)[source]

Bases: fermipy.jobs.chain.Chain

Small class to merge counts cubes for a series of binning components

This chain consists of multiple Link objects:

coadd-EBIN-ZCUT-FILTER-EVTYPE : _Link_FermipyCoadd
Link to coadd data of a particular type.
Parameters:
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • dry_run (<class 'bool'>) – Print commands but do not run them [False]
  • nfiles (<class 'int'>) – Number of input files [96]
  • do_ltsum (<class 'bool'>) – Sum livetime cube files [False]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
appname = 'fermipy-coadd-split'
default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'do_ltsum': (False, 'Sum livetime cube files', <class 'bool'>), 'dry_run': (False, 'Print commands but do not run them', <class 'bool'>), 'nfiles': (96, 'Number of input files', <class 'int'>)}
description = 'Merge a set of counts cube files'
linkname_default = 'coadd-split'
usage = 'fermipy-coadd-split [options]'
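
The fermipy-coadd-split chain can be invoked the same way as the parallel tools above; a minimal sketch, with flag names assumed to mirror the documented parameters:

    import subprocess

    # Co-add the per-file counts cubes into one cube per binning
    # component; 96 is the documented default number of input files.
    subprocess.run(
        ["fermipy-coadd-split",
         "--comp", "config/binning.yaml",
         "--data", "config/dataset_sourceveto.yaml",
         "--nfiles", "96"],
        check=True)
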
class fermipy.diffuse.gt_split_and_bin.SplitAndBin(**kwargs)[source]

Bases: fermipy.jobs.chain.Chain

Small class to split and bin data according to some user-provided specification

This chain consists of multiple Link objects:

select-energy-EBIN-ZCUT : Gtlink_select
Initial splitting by energy bin and zenith angle cut
select-type-EBIN-ZCUT-FILTER-TYPE : Gtlink_select
Refinement of selection from event types
bin-EBIN-ZCUT-FILTER-TYPE : Gtlink_bin
Final binning of the data for each event type
Parameters:
  • dry_run (<class 'bool'>) – Print commands but do not run them [False]
  • outkey (<class 'str'>) – Key for this particular output file [None]
  • hpx_order_max (<class 'int'>) – Maximum HEALPIX order for binning counts data. [9]
  • ft1file (<class 'str'>) – Input FT1 file [None]
  • pfiles (<class 'str'>) – Directory for .par files [None]
  • evclass (<class 'int'>) – Event class bit mask [128]
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • scratch (<class 'str'>) – Scratch area [None]
  • outdir (<class 'str'>) – Base name for output files [counts_cubes_cr]
appname = 'fermipy-split-and-bin'
default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'dry_run': (False, 'Print commands but do not run them', <class 'bool'>), 'evclass': (128, 'Event class bit mask', <class 'int'>), 'ft1file': (None, 'Input FT1 file', <class 'str'>), 'hpx_order_max': (9, 'Maximum HEALPIX order for binning counts data.', <class 'int'>), 'outdir': ('counts_cubes_cr', 'Base name for output files', <class 'str'>), 'outkey': (None, 'Key for this particular output file', <class 'str'>), 'pfiles': (None, 'Directory for .par files', <class 'str'>), 'scratch': (None, 'Scratch area', <class 'str'>)}
description = 'Run gtselect and gtbin together'
linkname_default = 'split-and-bin'
usage = 'fermipy-split-and-bin [options]'
class fermipy.diffuse.gt_split_and_bin.SplitAndBinChain(**kwargs)[source]

Bases: fermipy.jobs.chain.Chain

Chain to run split and bin and then make exposure cubes

This chain consists of:

split-and-bin : SplitAndBin_SG
Chain to make the binned counts maps for each input file
coadd-split : CoaddSplit_SG
Link to co-add the binned counts map files
expcube2 : Gtexpcube2_SG
Link to make the corresponding binned exposure maps
Parameters:
  • hpx_order_ccube (<class 'int'>) – Maximum HEALPIX order for binning counts data. [9]
  • hpx_order_expcube (<class 'int'>) – Maximum HEALPIX order for exposure cubes. [6]
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • dry_run (<class 'bool'>) – Print commands but do not run them [False]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • ft1file (<class 'str'>) – Path to list of input FT1 files [P8_P305_8years_source_zmax105.lst]
  • scratch (<class 'str'>) – Path to scratch area. [None]
appname = 'fermipy-split-and-bin-chain'
default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'dry_run': (False, 'Print commands but do not run them', <class 'bool'>), 'ft1file': ('P8_P305_8years_source_zmax105.lst', 'Path to list of input FT1 files', <class 'str'>), 'hpx_order_ccube': (9, 'Maximum HEALPIX order for binning counts data.', <class 'int'>), 'hpx_order_expcube': (6, 'Maximum HEALPIX order for exposure cubes.', <class 'int'>), 'scratch': (None, 'Path to scratch area.', <class 'str'>)}
description = 'Run split-and-bin, coadd-split and exposure'
linkname_default = 'split-and-bin-chain'
usage = 'fermipy-split-and-bin-chain [options]'
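
A minimal sketch that builds the command line from a dict of the documented options (the FT1 list is the documented default; flag names are assumed to match the parameter names):

    import subprocess

    # Documented defaults for the split-and-bin chain.
    opts = {
        "comp": "config/binning.yaml",
        "data": "config/dataset_sourceveto.yaml",
        "ft1file": "P8_P305_8years_source_zmax105.lst",
        "hpx_order_ccube": "9",
        "hpx_order_expcube": "6",
    }
    cmd = ["fermipy-split-and-bin-chain"]
    for key, val in opts.items():
        cmd += ["--%s" % key, val]
    subprocess.run(cmd, check=True)
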
class fermipy.diffuse.gt_split_and_mktime.SplitAndMktime(**kwargs)[source]

Bases: fermipy.jobs.chain.Chain

Small class to split, apply mktime and bin data according to some user-provided specification

This chain consists of multiple Link objects:

select-energy-EBIN-ZCUT : Gtlink_select
Initial splitting by energy bin and zenith angle cut
mktime-EBIN-ZCUT-FILTER : Gtlink_mktime
Application of gtmktime filter for zenith angle cut
ltcube-EBIN-ZCUT-FILTER : Gtlink_ltcube
Computation of livetime cube for zenith angle cut
select-type-EBIN-ZCUT-FILTER-TYPE : Gtlink_select
Refinement of selection from event types
bin-EBIN-ZCUT-FILTER-TYPE : Gtlink_bin
Final binning of the data for each event type
Parameters:
  • do_ltsum (<class 'bool'>) – Sum livetime cube files [False]
  • outkey (<class 'str'>) – Key for this particular output file [None]
  • hpx_order_max (<class 'int'>) – Maximum HEALPIX order for binning counts data. [9]
  • ft1file (<class 'str'>) – Path to list of input FT1 files [P8_P305_8years_source_zmax105.lst]
  • ft2file (<class 'str'>) – Path to list of input FT2 files [ft2_8years.lst]
  • pfiles (<class 'str'>) – Directory for .par files [None]
  • evclass (<class 'int'>) – Event class bit mask [128]
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • dry_run (<class 'bool'>) – Print commands but do not run them [False]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • scratch (<class 'str'>) – Scratch area [None]
  • outdir (<class 'str'>) – Output directory [counts_cubes]
appname = 'fermipy-split-and-mktime'
default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'do_ltsum': (False, 'Sum livetime cube files', <class 'bool'>), 'dry_run': (False, 'Print commands but do not run them', <class 'bool'>), 'evclass': (128, 'Event class bit mask', <class 'int'>), 'ft1file': ('P8_P305_8years_source_zmax105.lst', 'Path to list of input FT1 files', <class 'str'>), 'ft2file': ('ft2_8years.lst', 'Path to list of input FT2 files', <class 'str'>), 'hpx_order_max': (9, 'Maximum HEALPIX order for binning counts data.', <class 'int'>), 'outdir': ('counts_cubes', 'Output directory', <class 'str'>), 'outkey': (None, 'Key for this particular output file', <class 'str'>), 'pfiles': (None, 'Directory for .par files', <class 'str'>), 'scratch': (None, 'Scratch area', <class 'str'>)}
description = 'Run gtselect and gtbin together'
linkname_default = 'split-and-mktime'
usage = 'fermipy-split-and-mktime [options]'
class fermipy.diffuse.gt_split_and_mktime.SplitAndMktimeChain(**kwargs)[source]

Bases: fermipy.jobs.chain.Chain

Chain to run split and mktime and then make livetime and exposure cubes

This chain consists of:

split-and-mktime : SplitAndMktime_SG
Chain to make the binned counts maps for each input file
coadd-split : CoaddSplit_SG
Link to co-add the binned counts map files
ltsum : Gtltsum_SG
Link to co-add the livetime cube files
expcube2 : Gtexpcube2_SG
Link to make the corresponding binned exposure maps
Parameters:
  • do_ltsum (<class 'bool'>) – Run gtltsum on inputs [False]
  • ft1file (<class 'str'>) – Path to list of input FT1 files [P8_P305_8years_source_zmax105.lst]
  • hpx_order_ccube (<class 'int'>) – Maximum HEALPIX order for binning counts data. [9]
  • ft2file (<class 'str'>) – Path to list of input FT2 files [ft2_8years.lst]
  • hpx_order_expcube (<class 'int'>) – Maximum HEALPIX order for exposure cubes. [6]
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • dry_run (<class 'bool'>) – Print commands but do not run them [False]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • scratch (<class 'str'>) – Path to scratch area. [None]
appname = 'fermipy-split-and-mktime-chain'
default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'do_ltsum': (False, 'Run gtltsum on inputs', <class 'bool'>), 'dry_run': (False, 'Print commands but do not run them', <class 'bool'>), 'ft1file': ('P8_P305_8years_source_zmax105.lst', 'Path to list of input FT1 files', <class 'str'>), 'ft2file': ('ft2_8years.lst', 'Path to list of input FT2 files', <class 'str'>), 'hpx_order_ccube': (9, 'Maximum HEALPIX order for binning counts data.', <class 'int'>), 'hpx_order_expcube': (6, 'Maximum HEALPIX order for exposure cubes.', <class 'int'>), 'scratch': (None, 'Path to scratch area.', <class 'str'>)}
description = 'Run split-and-mktime, coadd-split and exposure'
linkname_default = 'split-and-mktime-chain'
usage = 'fermipy-split-and-mktime-chain [options]'
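
The class attributes documented above are ordinary Python attributes, so the usage string and the livetime-related defaults can be checked without running anything; a minimal sketch:

    from fermipy.diffuse.gt_split_and_mktime import SplitAndMktimeChain

    print(SplitAndMktimeChain.usage)
    # Echo the documented defaults for the options that distinguish this
    # chain from fermipy-split-and-bin-chain.
    for key in ("do_ltsum", "ft2file", "hpx_order_expcube"):
        default, helptext, _otype = SplitAndMktimeChain.default_options[key]
        print("  --%s : %s [%s]" % (key, helptext, default))
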
class fermipy.diffuse.diffuse_analysis.DiffuseCompChain(**kwargs)[source]

Bases: fermipy.jobs.chain.Chain

Chain to build srcmaps for diffuse components

This chain consists of:

sum-rings : SumRings_SG
Merge GALProp gas maps by type and ring
srcmaps-diffuse : SrcmapsDiffuse_SG
Compute diffuse component source maps in parallel
vstack-diffuse : Vstack_SG
Combine diffuse component source maps
Parameters:
  • make_xml (<class 'bool'>) – Make XML files. [True]
  • dry_run (<class 'bool'>) – Print commands but do not run them [False]
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • library (<class 'str'>) – Path to yaml file defining model components. [models/library.yaml]
  • outdir (<class 'str'>) – Output directory [None]
appname = 'fermipy-diffuse-comp-chain'
default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'dry_run': (False, 'Print commands but do not run them', <class 'bool'>), 'library': ('models/library.yaml', 'Path to yaml file defining model components.', <class 'str'>), 'make_xml': (True, 'Make XML files.', <class 'bool'>), 'outdir': (None, 'Output directory', <class 'str'>)}
description = 'Run diffuse component analysis'
linkname_default = 'diffuse-comp'
usage = 'fermipy-diffuse-comp-chain [options]'
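
A minimal sketch of running the whole diffuse-component chain (sum-rings, srcmaps-diffuse, vstack-diffuse) in one go; the output directory is hypothetical and the flag names are assumed to match the documented parameters:

    import subprocess

    subprocess.run(
        ["fermipy-diffuse-comp-chain",
         "--comp", "config/binning.yaml",
         "--data", "config/dataset_sourceveto.yaml",
         "--library", "models/library.yaml",
         # 'srcmaps_diffuse' is a hypothetical output directory
         "--outdir", "srcmaps_diffuse"],
        check=True)
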
class fermipy.diffuse.diffuse_analysis.CatalogCompChain(**kwargs)[source]

Bases: fermipy.jobs.chain.Chain

Small class to build srcmaps for catalog components

This chain consists of:

srcmaps-catalog : SrcmapsCatalog_SG
Build source maps for all catalog sources in parallel
gather-srcmaps : GatherSrcmaps_SG
Gather source maps into single files
merge-srcmaps : MergeSrcmaps_SG
Compute source maps for merged sources
Parameters:
  • make_xml (<class 'bool'>) – Make XML files for diffuse components [False]
  • nsrc (<class 'int'>) – Number of sources per job [500]
  • dry_run (<class 'bool'>) – Print commands but do not run them [False]
  • data (<class 'str'>) – Path to yaml file defining dataset. [config/dataset_sourceveto.yaml]
  • comp (<class 'str'>) – Path to yaml file defining binning. [config/binning.yaml]
  • library (<class 'str'>) – Path to yaml file defining model components. [models/library.yaml]
appname = 'fermipy-catalog-comp-chain'
default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'dry_run': (False, 'Print commands but do not run them', <class 'bool'>), 'library': ('models/library.yaml', 'Path to yaml file defining model components.', <class 'str'>), 'make_xml': (False, 'Make XML files for diffuse components', <class 'bool'>), 'nsrc': (500, 'Number of sources per job', <class 'int'>)}
description = 'Run catalog component analysis'
linkname_default = 'catalog-comp'
usage = 'fermipy-catalog-comp-chain [options]'
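
The catalog-component chain is driven the same way; a minimal sketch, with flag names assumed to mirror the documented parameters:

    import subprocess

    # Build, gather and merge the catalog-source maps, 500 sources per
    # job (the documented default).
    subprocess.run(
        ["fermipy-catalog-comp-chain",
         "--comp", "config/binning.yaml",
         "--data", "config/dataset_sourceveto.yaml",
         "--library", "models/library.yaml",
         "--nsrc", "500"],
        check=True)
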
class fermipy.diffuse.gt_assemble_model.AssembleModelChain(**kwargs)[source]

Bases: fermipy.jobs.chain.Chain

Small class to assemble the models for fitting

appname = 'fermipy-assemble-model-chain'
default_options = {'comp': ('config/binning.yaml', 'Path to yaml file defining binning.', <class 'str'>), 'data': ('config/dataset_sourceveto.yaml', 'Path to yaml file defining dataset.', <class 'str'>), 'dry_run': (False, 'Print commands but do not run them', <class 'bool'>), 'hpx_order': (7, 'Maximum HEALPIX order for model fitting.', <class 'int'>), 'library': ('models/library.yaml', 'Path to yaml file defining model components.', <class 'str'>), 'models': ('models/modellist.yaml', 'Path to yaml file defining models.', <class 'str'>)}
description = 'Run init-model and assemble-model'
linkname_default = 'assemble-model-chain'
usage = 'fermipy-assemble-model-chain [options]'
class fermipy.diffuse.diffuse_analysis.DiffuseAnalysisChain(**kwargs)[source]

Bases: fermipy.jobs.chain.Chain

Chain to define diffuse all-sky analysis

This chain consists of:

prepare : SplitAndBinChain
Bin the data and make the exposure maps
diffuse-comp : DiffuseCompChain
Make source maps for diffuse components
catalog-comp : CatalogCompChain
Make source maps for catalog components
assemble-model : AssembleModelChain
Assemble the models for fitting
Parameters:
  • dry_run (<class 'bool'>) – Print commands but do not run them [False]
  • config (<class 'str'>) – Config yaml file [None]
appname = 'fermipy-diffuse-analysis'
default_options = {'config': (None, 'Config yaml file', <class 'str'>), 'dry_run': (False, 'Print commands but do not run them', <class 'bool'>)}
description = 'Run diffuse analysis chain'
linkname_default = 'diffuse'
usage = 'fermipy-diffuse-analysis [options]'
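
The top-level chain only needs a single configuration file; a minimal sketch, where the file name is hypothetical and its contents are described elsewhere in the fermipy documentation:

    import subprocess

    # Run the full prepare / diffuse-comp / catalog-comp / assemble-model
    # sequence from one (hypothetical) top-level configuration file.
    subprocess.run(
        ["fermipy-diffuse-analysis",
         "--config", "config/diffuse_analysis.yaml"],
        check=True)
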
class fermipy.diffuse.residual_cr.ResidualCRChain(**kwargs)[source]

Bases: fermipy.jobs.chain.Chain

Chain to perform analysis of residual cosmic-ray contamination

This chain consists of:

split-and-mktime : SplitAndMktimeChain
Chain to bin up the data and make exposure cubes
residual-cr : ResidualCR
Residual CR analysis
Parameters:config (<class 'str'>) – Config yaml file [None]
appname = 'fermipy-residual-cr-chain'
default_options = {'config': (None, 'Config yaml file', <class 'str'>)}
description = 'Run residual cosmic ray analysis'
linkname_default = 'residual-cr-chain'
usage = 'fermipy-residual-cr-chain [options]'
class fermipy.diffuse.solar.SunMoonChain(**kwargs)[source]

Bases: fermipy.jobs.chain.Chain

Chain to construct sun and moon templates

This chain consists of:

exphpsun : Gtexphpsun_SG
Build the sun-centered exposure cubes
suntemp : Gtsuntemp_SG
Build the templates
appname = 'fermipy-sunmoon-chain'
default_options = {'config': (None, 'Config yaml file', <class 'str'>)}
description = 'Run sun and moon template construction'
linkname_default = 'summoon'
usage = 'fermipy-sunmoon-chain [options]'