Commit f5cf0dc8 authored by Daniel Scheffler

third version of wrapper for atmospheric correction (first working version)

algorithms.GEOPROCESSING:
- possible values of 'conversion_type_optical' renamed from 'Ref' to 'TOA_Ref' and 'BOA_Ref'
- added keyword 'meshwidth' to the following functions (allows much faster processing):
    - zoom_2Darray_to_shapeFullArr()
    - adjust_acquisArrProv_to_shapeFullArr()
    - get_lonlat_coord_array(): revised calculation of the meshgrid
    - calc_VZA_array()
    - calc_AcqTime_array()
    - calc_SZA_SAA_array()
- calc_RAA_array(): now receives a VAA_array instead of VAA_mean
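The speed-up from the new 'meshwidth' keyword comes from evaluating the interpolation grid only at every n-th pixel. A minimal sketch of the pattern (hypothetical function name and shapes; the real implementation lives in zoom_2Darray_to_shapeFullArr()):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def zoom_to_shape(arr2d, shape_full, meshwidth=1, method='linear'):
    # map the small array onto the full-resolution coordinate system
    rowpos = np.linspace(0, shape_full[0] - 1, arr2d.shape[0])
    colpos = np.linspace(0, shape_full[1] - 1, arr2d.shape[1])
    rgi = RegularGridInterpolator((rowpos, colpos), arr2d, method=method)
    # meshwidth > 1 evaluates only every n-th pixel -> far fewer grid points
    rows, cols = np.meshgrid(range(0, shape_full[0], meshwidth),
                             range(0, shape_full[1], meshwidth), indexing='ij')
    return rgi(np.dstack([rows, cols]))

small = np.arange(4, dtype=float).reshape(2, 2)
out = zoom_to_shape(small, (100, 100), meshwidth=10)
print(out.shape)  # (10, 10) instead of (100, 100)
```

With meshwidth=10 only 1 % of the output grid points are interpolated, which is where the "much faster processing" comes from.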

algorithms.gms_object:
- added attributes 'fullSceneCornerPos' and 'fullSceneCornerLonLat'
- logger: added assertion
- added property 'log'
- added dem deleter
- revised property 'ac_options'
- added property 'ac_errors'
- added property 'subset'
- refactored attribute 'acquisition_date' to 'acq_datetime' containing a full datetime timestamp in UTC time zone
- to_GMS_file(): updated timestamp format

algorithms.L1A_P.L1A_object:
- get_MetaObj(): now also updates 'acq_datetime'
- refactored get_MetaObj() to import_metadata()
- included set_arr_desc_from_MetaObj() in import_metadata()
- calc_TOARadRefTemp(): updated in the context of "conversion_type_optical" value change to 'TOA_Ref'
- calc_cloud_mask(): temporarily excluded Sentinel-2 here; added code draft of S2A cloud mask calculation
- calc_corner_positions(): major revision -> now calculates trueDataCornerPos/-LonLat AND fullSceneCornerPos/-LonLat
- calc_center_AcqTime() now also updates 'acq_datetime'

algorithms.L1B_P.ref_Scene:
- moved _get_reference_image_params_pgSQL() and _sceneIDList_to_filt_overlap_scenes() to L1B_object

algorithms.L1C_P.L1C_object: major revision
- added properties 'lonlat_arr', 'VZA_arr', 'VAA_arr', 'SZA_arr', 'SAA_arr', 'RAA_arr' based on get_lonlat_coord_array() and calc_acquisition_illumination_geometry()
- removed deprecated functions get_lonlat_coord_array() and calc_acquisition_illumination_geometry()
- revised delete_ac_input_arrays()
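The new L1C geometry properties can be pictured as lazily computed, cached arrays that can also be released again (as delete_ac_input_arrays() does). A rough sketch with hypothetical names; the real properties delegate to get_lonlat_coord_array() and the calc_*_array() helpers:

```python
import numpy as np

class L1CGeometry:
    """Sketch: compute each geometry array once, on first access."""
    def __init__(self, shape):
        self.shape = shape
        self._vza_arr = None

    @property
    def VZA_arr(self):
        if self._vza_arr is None:
            # stand-in for GEOP.calc_VZA_array(...); here: a constant angle
            self._vza_arr = np.full(self.shape, 5.0, np.float32)
        return self._vza_arr

    @VZA_arr.deleter
    def VZA_arr(self):
        # releasing the cached array frees memory after atmospheric correction
        self._vza_arr = None

geom = L1CGeometry((10, 10))
print(geom.VZA_arr.shape)  # (10, 10)
```

The same pattern would apply to 'lonlat_arr', 'VAA_arr', 'SZA_arr', 'SAA_arr' and 'RAA_arr'.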

algorithms.L1C_P.AtmCorr:
- revised property 'logger'
- metadata: added some tests
- revised _meta_get_viewing_zenith(), _meta_get_viewing_azimuth(), _meta_get_relative_viewing_azimuth()
- added _meta_get_aux_data()
- added _get_dem()
- added dummy version of _get_srf()
- run_atmospheric_correction(): added docstring; some minor revisions
- _join_results_to_inObjs(): now working

algorithms.METADATA:
- added property AcqDateTime: returns a full datetime object with UTC timezone
- revised setters for AcqDate, AcqTime and AcqDateTime -> timezone now properly handled
- refactored 'Meta2ODict' to 'to_meta_odict'
- calc_center_acquisition_time(): now also sets AcqDateTime
- get_LayerBandsAssignment(): processing level is now properly handled (in the context of missing bands after atmospheric correction)
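The revised timezone handling could be sketched like this (hypothetical, simplified versions of the AcqDate/AcqTime/AcqDateTime interplay; the real setters are part of the METADATA class):

```python
import datetime

class Meta:
    """Sketch: keep one tz-aware timestamp; derive date/time strings from it."""
    def __init__(self):
        self._acq_datetime = None

    @property
    def AcqDateTime(self):
        return self._acq_datetime

    @AcqDateTime.setter
    def AcqDateTime(self, value):
        # naive datetimes are assumed to be UTC and made tz-aware
        if value.tzinfo is None:
            value = value.replace(tzinfo=datetime.timezone.utc)
        self._acq_datetime = value

    @property
    def AcqDate(self):
        return self._acq_datetime.strftime('%Y-%m-%d')

    @property
    def AcqTime(self):
        return self._acq_datetime.strftime('%H:%M:%S')

m = Meta()
m.AcqDateTime = datetime.datetime(2017, 1, 15, 10, 30, 0)
print(m.AcqDate, m.AcqTime)  # 2017-01-15 10:30:00
```

Storing a single tz-aware timestamp avoids the date/time/timezone drifting apart, which also matches the 'acq_datetime' refactoring in gms_object.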

misc.database_tools:
- renamed keyword 'trueDataCornerLonLat' to 'tgt_corners_lonlat' in the following functions:
    - get_pgSQL_geospatial_query_cond()
    - get_overlapping_scenes_from_postgreSQLdb()
    - get_overlapping_MGRS_tiles()
    - get_overlapping_MGRS_tiles2()

misc.definitions_dicts:
- added is_dataset_provided_as_fullScene()
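The new helper presumably maps a dataset's satellite/sensor to a flag saying whether scenes always cover the full sensor swath (so all four scene corners lie inside the data), as used later in calc_corner_positions(). A hypothetical sketch of the lookup idea; the signature and the table entries are assumptions:

```python
def is_dataset_provided_as_fullScene(satellite, sensor):
    """Sketch: True if datasets of this sensor represent full scenes;
    tiled/granule products generally do not."""
    full_scene = {
        ('Landsat-7', 'ETM+'): True,
        ('ALOS', 'AVNIR-2'): True,
        ('Sentinel-2A', 'MSI'): False,   # distributed as granules
        ('RapidEye', 'REIS'): False,     # distributed as tiles
    }
    return full_scene.get((satellite, sensor), True)

print(is_dataset_provided_as_fullScene('Sentinel-2A', 'MSI'))  # False
```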

misc.exception_handler:
- log_uncaught_exceptions(): exception handling is now optional and can be disabled via config
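An early return on the new config switch is one way the optional handling could look; a sketch with a stand-in config object (the real code reads the 'disable_exception_handler' Job attribute added in this commit):

```python
import functools

class config:  # stand-in for the real job configuration object
    disable_exception_handler = True

def log_uncaught_exceptions(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        if config.disable_exception_handler:
            # switch set: let exceptions propagate unchanged (easier debugging)
            return func(*args, **kwargs)
        try:
            return func(*args, **kwargs)
        except Exception as e:
            print('logged: %s' % e)  # stand-in for the real logger/DB record
            return None
    return wrapper

@log_uncaught_exceptions
def failing():
    raise ValueError('boom')

config.disable_exception_handler = False
print(failing())  # None (exception was caught and logged)
```

With the flag enabled, the wrapped call raises normally, which is useful when running under a debugger.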

misc.helper_functions:
- modified some docstrings

misc.logging.GMS_logger:
- added property 'captured_stream' (not yet working)
- added draft of StringIO handler (not yet working)

misc.path_generator:
- get_path_srf_file(): bugfix

processing.pipeline:
- L1A_map(): updated calls
- L1A_map_1(): updated calls
- revised L1C_map()

processing.process_controller:
- add_local_availability(): added comments and revised structure

config:
- Job:
    - added attribute 'disable_exception_handler'
    - added attribute 'scale_factor_BOARef'
    - added attribute 'scale_factor_errors_ac'
    - added assertion

pg_SQLdb:
- table config:
    - added fields 'scale_factor_BOARef' and 'scale_factor_errors_ac'
    - changed value of 'conversion_type_optical' to 'BOA_Ref'

- updated __version__
parent a53721f1
......@@ -15,7 +15,7 @@ from . import config
from .processing.process_controller import process_controller
__version__ = '20170111.01'
__version__ = '20170115.01'
__author__ = 'Daniel Scheffler'
__all__ = ['algorithms',
'io',
......
......@@ -301,7 +301,7 @@ class GEOPROCESSING(object):
:param cutNeg: bool: if true. all negative values turned to zero. default: True
:return: Int16 TOA_Reflectance in [0-10000]
"""
self.conversion_type_optical = 'Ref'
self.conversion_type_optical = 'TOA_Ref'
tempArr = gdalnumeric.LoadFile(self.desc) if not self.subset or self.subset[0] == 'cube' else \
gdalnumeric.LoadFile(self.desc,self.colStart,self.rowStart,self.cols,self.rows)
......@@ -341,7 +341,7 @@ class GEOPROCESSING(object):
:param v: verbose
:return: Int16 TOA_Reflectance in [0-10000]
"""
self.conversion_type_optical = 'Ref'
self.conversion_type_optical = 'TOA_Ref'
self.outpath_conv = self.workspace if outPath is None or outPath == "" else outPath
# --log arguments:
self.logger.info("\n\nGEOPROCESSING DN2TOARef:")
......@@ -1142,7 +1142,7 @@ class GEOPROCESSING(object):
:param mask_data_nodata: 2D-numpy array 1bit
:return: [UL, UR, LL, LR] as [(ULrow,ULcol),(URrow,URcol),...],[(ULLon,ULLat),(URLon,URLat),..]
"""
assert self.subset is None, 'Unable to calculate trueDataCornerPos for a subset of an image. ' \
assert self.subset is None, 'Unable to calculate fullSceneCornerPos for a subset of an image. ' \
'calc_FullDataset_corner_positions can only be called if GEOP object represents a full image.'
mask_1bit = mask_data_nodata if mask_data_nodata is not None else self.calc_mask_data_nodata()
......@@ -1171,7 +1171,7 @@ class GEOPROCESSING(object):
if True in [abs(np.diff(i)[0]) > 10 for i in
[StartStopRows_in_first_dataCol, StartStopRows_in_last_dataCol, StartStopCols_in_first_dataRow,
StartStopCols_in_last_dataRow]]:
''' In case of cut image corners (e.g. ALOS AVNIR-2 Level-1B2): Calculation of trueDataCornerPos outside of the image.'''
''' In case of cut image corners (e.g. ALOS AVNIR-2 Level-1B2): Calculation of fullSceneCornerPos outside of the image.'''
# print ('Image corners detected to be cut.')
def find_line_intersection_point(line1, line2):
xdiff = (line1[0][0] - line1[1][0], line2[0][0] - line2[1][0])
......@@ -2751,30 +2751,33 @@ def get_point_bounds(points):
return min_lon, min_lat, max_lon, max_lat
def zoom_2Darray_to_shapeFullArr(arr2D,shapeFullArr,subset=None,method='linear'):
def zoom_2Darray_to_shapeFullArr(arr2D,shapeFullArr,meshwidth=1,subset=None,method='linear'):
assert method in ['linear','nearest']
rS,rE,cS,cE = list(get_subsetProps_from_subsetArg(shapeFullArr,subset).values())[3:7]
rowpos = np.linspace(0,shapeFullArr[0]-1,arr2D.shape[0])
colpos = np.linspace(0,shapeFullArr[1]-1,arr2D.shape[1])
rgi = RegularGridInterpolator([rowpos,colpos],arr2D,method=method)
out_rows_grid, out_cols_grid = np.meshgrid(range(rS,rE),range(cS,cE),indexing='ij')
out_rows_grid, out_cols_grid = np.meshgrid(range(rS,rE,meshwidth),range(cS,cE,meshwidth),indexing='ij')
return rgi(np.dstack([out_rows_grid,out_cols_grid]))
def adjust_acquisArrProv_to_shapeFullArr(arrProv,shapeFullArr,subset=None,bandwise=False):
assert isinstance(arrProv,dict), 'Expected a dictionary with band names (from LayerbandsAssignment) as keys and ' \
'arrays as values. Got %s.' %type(arrProv)
if isinstance(arrProv[list(arrProv.keys())[0]],list):
arrProv = {k:np.array(v) for k,v in arrProv.items()}
shapes = [v.shape for v in arrProv.values()]
def adjust_acquisArrProv_to_shapeFullArr(arrProv, shapeFullArr, meshwidth=1, subset=None, bandwise=False):
assert isinstance(arrProv, dict) and isinstance(arrProv[list(arrProv.keys())[0]], list), \
'Expected a dictionary with band names (from LayerbandsAssignment) as keys and lists as ' \
'values. Got %s.' % type(arrProv)
arrProv = {k:np.array(v) for k,v in arrProv.items()}
shapes = [v.shape for v in arrProv.values()]
assert len(list(set(shapes)))==1, 'The arrays contained in the values of arrProv have different shapes. Got %s.' %shapes
if bandwise:
outDict = {k:zoom_2Darray_to_shapeFullArr(arr,shapeFullArr,subset) for k,arr in arrProv.items()}
outDict = {k:zoom_2Darray_to_shapeFullArr(arr,shapeFullArr,meshwidth,subset) for k,arr in arrProv.items()}
return outDict
else:
arr2interp = np.mean(np.dstack(arrProv.values()),axis=2)
interpolated = zoom_2Darray_to_shapeFullArr(arr2interp,shapeFullArr,subset).astype(np.float32)
interpolated = zoom_2Darray_to_shapeFullArr(arr2interp,shapeFullArr,meshwidth,subset).astype(np.float32)
return interpolated
......@@ -2982,29 +2985,24 @@ def is_granule(trueCornerPos): # TODO
# [transform_wgs84_to_utm(LL[0],LL[1],get_UTMzone(prj=prj)) for LL in LonLat]
def get_lonlat_coord_array(shape_fullArr, arr_pos, geotransform, projection, nodata_mask=None, outFill=None):
"""Returns numpy array containing longitude pixel coordinates (band 0) and latitude pixel coordinates
(band 1).
def get_lonlat_coord_array(shape_fullArr, arr_pos, geotransform, projection, meshwidth=1, nodata_mask=None, outFill=None):
"""Returns numpy array containing longitude pixel coordinates (band 0) and latitude pixel coordinates (band 1).
:param shape_fullArr:
:param arr_pos:
:param geotransform:
:param projection:
:param meshwidth: <int> defines the density of the mesh used for generating the output
(1: full resolution; 10: one point each 10 pixels)
:param nodata_mask: <numpy array>, used for declaring nodata values in the output lonlat array
:param outFill: the value that is assigned to nodata area in the output lonlat array"""
rows,cols,bands,rowStart,rowEnd,colStart,colEnd = get_subsetProps_from_shapeFullArr_arrPos(shape_fullArr,arr_pos)
xpos = np.array(range(colStart, colStart + cols), np.float32)
ypos = np.array(range(rowStart, rowStart + rows), np.float32)
xcoord_row = xpos * geotransform[1] + geotransform[0]
ycoord_col = ypos * geotransform[5] + geotransform[3]
xcoord_arr = np.empty((rows, cols), np.float32)
xcoord_arr[None, :] = xcoord_row
ycoord_arr = np.empty((cols, rows), np.float32)
ycoord_arr[None, :] = ycoord_col
ycoord_arr = ycoord_arr.T if sys.version_info[0] < 3 else ycoord_arr.T.flatten().reshape(-1, cols)
# ycoord_arr.T causes a RuntimeError in proj(xcoord_arr,ycoord_arr,inverse=True) under Python 3
xcoord_arr, ycoord_arr = np.meshgrid(range(colStart, colStart + cols, meshwidth),
range(rowStart, rowStart + rows, meshwidth))
xcoord_arr = xcoord_arr * geotransform[1] + geotransform[0]
ycoord_arr = ycoord_arr * geotransform[5] + geotransform[3]
assert projection, 'Unable to calculate LonLat array. Invalid projection. Got %s.' %projection
src_srs = osr.SpatialReference()
......@@ -3022,10 +3020,10 @@ def get_lonlat_coord_array(shape_fullArr, arr_pos, geotransform, projection, nod
outFill = outFill if outFill else get_outFillZeroSaturated('float32')[0]
xcoord_ycoord_arr[nodata_mask.astype(np.int8)==False] = outFill
UL_lonlat = (round(xcoord_ycoord_arr[0, 0, 0], 10), round(xcoord_ycoord_arr[0, 0, 1], 10))
UR_lonlat = (round(xcoord_ycoord_arr[0, cols - 1, 0], 10), round(xcoord_ycoord_arr[0, cols - 1, 1], 10))
LL_lonlat = (round(xcoord_ycoord_arr[rows - 1, 0, 0], 10), round(xcoord_ycoord_arr[rows - 1, 0, 1], 10))
LR_lonlat = (round(xcoord_ycoord_arr[rows - 1, cols-1, 0], 10), round(xcoord_ycoord_arr[rows-1, cols-1, 1], 10))
UL_lonlat = (round(xcoord_ycoord_arr[0 , 0 , 0], 10), round(xcoord_ycoord_arr[0 , 0 , 1], 10))
UR_lonlat = (round(xcoord_ycoord_arr[0 , -1, 0], 10), round(xcoord_ycoord_arr[0 , -1, 1], 10))
LL_lonlat = (round(xcoord_ycoord_arr[-1, 0 , 0], 10), round(xcoord_ycoord_arr[-1, 0 , 1], 10)) # TODO validate that coordinate
LR_lonlat = (round(xcoord_ycoord_arr[-1, -1, 0], 10), round(xcoord_ycoord_arr[-1, -1, 1], 10)) # TODO validate that coordinate
# self.logger.info(self.filename)
# self.logger.info('UL', UL_lat)
# self.logger.info('UR', UR_lat)
......@@ -3035,25 +3033,28 @@ def get_lonlat_coord_array(shape_fullArr, arr_pos, geotransform, projection, nod
return xcoord_ycoord_arr.astype(np.float32), [UL_lonlat, UR_lonlat, LL_lonlat, LR_lonlat]
def calc_VAA_using_trueCornerLonLat(trueDataCornerLonLat):
UL_LonLat, UR_LonLat, LL_LonLat, LR_LonLat = trueDataCornerLonLat
def calc_VAA_using_fullSceneCornerLonLat(fullSceneCornerLonLat):
# type: (list) -> float
UL_LonLat, UR_LonLat, LL_LonLat, LR_LonLat = fullSceneCornerLonLat
forward_az_left = pyproj.Geod(ellps='WGS84').inv(*LL_LonLat, *UL_LonLat)[0]
forward_az_right = pyproj.Geod(ellps='WGS84').inv(*LR_LonLat, *UR_LonLat)[0]
VAA_mean = np.mean([forward_az_left, forward_az_right])
VAA_mean = float(np.mean([forward_az_left, forward_az_right]))
return VAA_mean
def calc_VZA_array(shape_fullArr, arr_pos, trueDataCornerPos, viewing_angle, FOV, logger, nodata_mask=None,
outFill=None):
def calc_VZA_array(shape_fullArr, arr_pos, fullSceneCornerPos, viewing_angle, FOV, logger, meshwidth=1,
nodata_mask=None, outFill=None):
"""Calculates viewing zenith angles for each pixel in the dataset by solving
an equation system with 4 variables for each image corner:
VZA = a + b*col + c*row + d*col*row
:param shape_fullArr:
:param arr_pos:
:param trueDataCornerPos:
:param fullSceneCornerPos:
:param viewing_angle:
:param FOV:
:param logger:
:param meshwidth: <int> defines the density of the mesh used for generating the output
(1: full resolution; 10: one point each 10 pixels)
:param nodata_mask: <numpy array>, used for declaring nodata values in the output VZA array
:param outFill: the value that is assigned to nodata area in the output VZA array"""
......@@ -3061,17 +3062,18 @@ def calc_VZA_array(shape_fullArr, arr_pos, trueDataCornerPos, viewing_angle, FOV
"'nodata_mask' must be a numpy array or an instance of GeoArray. Got %s" % type(nodata_mask)
colsFullArr, rowsFullArr, bandsFullArr = shape_fullArr
rows,cols,bands,rowStart,rowEnd,colStart,colEnd = get_subsetProps_from_shapeFullArr_arrPos(shape_fullArr, arr_pos)
ul, ur, ll, lr = trueDataCornerPos
ul, ur, ll, lr = fullSceneCornerPos
if list(trueDataCornerPos) == list(
cols_arr, rows_arr = np.meshgrid(range(colStart, colStart + cols, meshwidth),
range(rowStart, rowStart + rows, meshwidth))
if list(fullSceneCornerPos) == list(
([0, 0], [0, colsFullArr - 1], [rowsFullArr - 1, 0], [rowsFullArr - 1, colsFullArr - 1])):
logger.warning('No precise calculation of VZA array possible because orbit track cannot be '
'reconstructed from dataset since true data corner positions are unknown. VZA array is '
'reconstructed from dataset since full scene corner positions are unknown. VZA array is '
'filled with the value provided in metadata.')
VZA_array = np.full((rows, cols), viewing_angle, np.int64)
VZA_array = np.full(cols_arr.shape, viewing_angle, np.int64) # respects meshwidth
else:
cols_arr, rows_arr = np.meshgrid(range(colStart, colStart + cols), range(rowStart, rowStart + rows))
coeff_matrix = np.array([[1, ul[1], ul[0], ul[1] * ul[0]],
[1, ur[1], ur[0], ur[1] * ur[0]],
[1, ll[1], ll[0], ll[1] * ll[0]],
......@@ -3092,12 +3094,15 @@ def calc_VZA_array(shape_fullArr, arr_pos, trueDataCornerPos, viewing_angle, FOV
return VZA_array
def calc_AcqTime_array(shape_fullArr, arr_pos, AcqDate, CenterAcqTime, trueDataCornerPos, overpassDurationSec):
ul, ur, ll, lr = trueDataCornerPos
def calc_AcqTime_array(shape_fullArr, arr_pos, AcqDate, CenterAcqTime, fullSceneCornerPos, overpassDurationSec,
meshwidth=1):
ul, ur, ll, lr = fullSceneCornerPos
rows,cols,bands,rowStart,rowEnd,colStart,colEnd = get_subsetProps_from_shapeFullArr_arrPos(shape_fullArr, arr_pos)
cols_arr, rows_arr = np.meshgrid(range(colStart, colStart+cols), range(rowStart, rowStart+rows))
cols_arr, rows_arr = np.meshgrid(range(colStart, colStart+cols, meshwidth),
range(rowStart, rowStart+rows, meshwidth))
time_center = datetime.datetime.strptime('%s %s' % (AcqDate, CenterAcqTime), '%Y-%m-%d %H:%M:%S')
time_start = time_center - datetime.timedelta(seconds=overpassDurationSec)
......@@ -3138,17 +3143,20 @@ def calc_SZA_SAA(date, lon, lat): # not used anymore since pyorbital is more pr
return SZA, SAA
def calc_SZA_SAA_array(shape_fullArr, arr_pos, AcqDate, CenterAcqTime, trueDataCornerPos, trueDataCornerLonLat,
overpassDurationSec, logger, nodata_mask=None, outFill=None, accurracy='coarse', lonlat_arr=None):
def calc_SZA_SAA_array(shape_fullArr, arr_pos, AcqDate, CenterAcqTime, fullSceneCornerPos, fullSceneCornerLonLat,
overpassDurationSec, logger, meshwidth=1, nodata_mask=None, outFill=None, accurracy='coarse',
lonlat_arr=None):
"""Calculates solar zenith and azimuth angles for each pixel in the dataset using pyorbital.
:param shape_fullArr:
:param arr_pos:
:param AcqDate:
:param CenterAcqTime:
:param trueDataCornerPos:
:param trueDataCornerLonLat:
:param fullSceneCornerPos:
:param fullSceneCornerLonLat:
:param overpassDurationSec:
:param logger:
:param meshwidth: <int> defines the density of the mesh used for generating the output
(1: full resolution; 10: one point each 10 pixels)
:param nodata_mask: <numpy array>, used for declaring nodata values in the output SZA/SAA array
:param outFill: the value that is assigned to nodata area in the output SZA/SAA array
:param accurracy: 'fine' or 'coarse'
......@@ -3165,26 +3173,26 @@ def calc_SZA_SAA_array(shape_fullArr, arr_pos, AcqDate, CenterAcqTime, trueDataC
"requires lonlat_arr to be specified. SZA/SAA is calculated with accurracy = 'coarse'."
assert accurracy in ['fine', 'coarse'], "The keyword accuracy accepts only 'fine' or 'coarse'. Got %s" % accurracy
ul, ur, ll, lr = trueDataCornerPos
ul, ur, ll, lr = fullSceneCornerPos
colsFullArr, rowsFullArr, bandsFullArr = shape_fullArr
if list(trueDataCornerPos) == list(
if list(fullSceneCornerPos) == list(
([0, 0], [0, colsFullArr - 1], [rowsFullArr - 1, 0], [rowsFullArr - 1, colsFullArr - 1])):
logger.warning('No precise calculation of SZA/SAA array possible because orbit track cannot '
'be reconstructed from dataset since true data corner positions are unknown. Same '
'be reconstructed from dataset since full scene corner positions are unknown. Same '
'acquisition time for each pixel is assumed for SZA/SAA calculation.')
time_center = datetime.datetime.strptime('%s %s' % (AcqDate, CenterAcqTime), '%Y-%m-%d %H:%M:%S')
time_start = time_center
time_end = time_center
else:
# overpassDurationSec = self.get_overpass_duration(trueDataCornerLonLat, orbitParams)[0]
# overpassDurationSec = self.get_overpass_duration(fullSceneCornerLonLat, orbitParams)[0]
time_center = datetime.datetime.strptime('%s %s' % (AcqDate, CenterAcqTime), '%Y-%m-%d %H:%M:%S')
time_start = time_center - datetime.timedelta(seconds=float(overpassDurationSec) / 2)
time_end = time_center + datetime.timedelta(seconds=float(overpassDurationSec) / 2)
if accurracy == 'fine':
datetime_arr = calc_AcqTime_array(shape_fullArr, arr_pos, AcqDate, CenterAcqTime,
trueDataCornerPos, overpassDurationSec)
fullSceneCornerPos, overpassDurationSec, meshwidth)
sol_alt_rad, sol_az_rad = astronomy.get_alt_az(datetime_arr, lonlat_arr[:, :, 0], lonlat_arr[:, :, 1])
SZA_array, SAA_array = 90 - (180 * sol_alt_rad / math.pi), 180 * sol_az_rad / math.pi
......@@ -3192,12 +3200,13 @@ def calc_SZA_SAA_array(shape_fullArr, arr_pos, AcqDate, CenterAcqTime, trueDataC
rows,cols,bands,rowStart,rowEnd,colStart,colEnd = \
get_subsetProps_from_shapeFullArr_arrPos(shape_fullArr, arr_pos)
cols_arr, rows_arr = np.meshgrid(range(colStart, colStart + cols), range(rowStart, rowStart + rows))
cols_arr, rows_arr = np.meshgrid(range(colStart, colStart + cols, meshwidth),
range(rowStart, rowStart + rows, meshwidth))
alt_UL_rad, az_UL_rad = astronomy.get_alt_az(time_end, trueDataCornerLonLat[0][0], trueDataCornerLonLat[0][1])
alt_UR_rad, az_UR_rad = astronomy.get_alt_az(time_end, trueDataCornerLonLat[1][0], trueDataCornerLonLat[1][1])
alt_LL_rad, az_LL_rad = astronomy.get_alt_az(time_start, trueDataCornerLonLat[2][0], trueDataCornerLonLat[2][1])
alt_LR_rad, az_LR_rad = astronomy.get_alt_az(time_start, trueDataCornerLonLat[3][0], trueDataCornerLonLat[3][1])
alt_UL_rad, az_UL_rad = astronomy.get_alt_az(time_end, fullSceneCornerLonLat[0][0],fullSceneCornerLonLat[0][1])
alt_UR_rad, az_UR_rad = astronomy.get_alt_az(time_end, fullSceneCornerLonLat[1][0],fullSceneCornerLonLat[1][1])
alt_LL_rad, az_LL_rad = astronomy.get_alt_az(time_start,fullSceneCornerLonLat[2][0],fullSceneCornerLonLat[2][1])
alt_LR_rad, az_LR_rad = astronomy.get_alt_az(time_start,fullSceneCornerLonLat[3][0],fullSceneCornerLonLat[3][1])
SZA_UL, SAA_UL = 90 - (180 * alt_UL_rad / math.pi), 180 * az_UL_rad / math.pi
SZA_UR, SAA_UR = 90 - (180 * alt_UR_rad / math.pi), 180 * az_UR_rad / math.pi
......@@ -3223,13 +3232,12 @@ def calc_SZA_SAA_array(shape_fullArr, arr_pos, AcqDate, CenterAcqTime, trueDataC
return SZA_array, SAA_array
def calc_RAA_array(trueDataCornerLonLat, SAA_array, VAA_mean=None, nodata_mask=None, outFill=None):
def calc_RAA_array(SAA_array, VAA_array, nodata_mask=None, outFill=None):
if nodata_mask is not None: assert isinstance(nodata_mask, (GeoArray, np.ndarray)), \
"'nodata_mask' must be a numpy array or an instance of GeoArray. Got %s" % type(nodata_mask)
if VAA_mean is None:
VAA_mean = calc_VAA_using_trueCornerLonLat(trueDataCornerLonLat)
RAA_array = SAA_array - VAA_mean
RAA_array = SAA_array - VAA_array
if nodata_mask is not None:
RAA_array[nodata_mask.astype(np.int8) == False] = outFill
return RAA_array
......
......@@ -27,7 +27,7 @@ from . import gms_cloud_classifier as CLD_P # Cloud Processor
from ..io import Output_writer as OUT_W
from ..misc import helper_functions as HLP_F
from ..misc import path_generator as PG
from ..misc.definition_dicts import get_outFillZeroSaturated
from ..misc.definition_dicts import get_outFillZeroSaturated, is_dataset_provided_as_fullScene
from .gms_object import GMS_object
......@@ -35,7 +35,7 @@ from .gms_object import GMS_object
class L1A_object(GMS_object):
"""Features input reader and raster-/metadata homogenization."""
def __init__(self, image_type='', satellite='', sensor='', subsystem='', sensormode='', acquisition_date=None,
def __init__(self, image_type='', satellite='', sensor='', subsystem='', sensormode='', acq_datetime=None,
entity_ID='', scene_ID=-9999, filename='', dataset_ID=-9999):
""":param : instance of gms_object.GMS_object or None
"""
......@@ -45,17 +45,17 @@ class L1A_object(GMS_object):
super().__init__()
# unpack kwargs
self.proc_level = 'L1A'
self.image_type = image_type # FIXME not needed anymore?
self.satellite = satellite
self.sensor = sensor
self.subsystem = subsystem
self.sensormode = sensormode
self.acquisition_date = acquisition_date
self.entity_ID = entity_ID
self.scene_ID = scene_ID
self.filename = filename
self.dataset_ID = dataset_ID
self.proc_level = 'L1A'
self.image_type = image_type # FIXME not needed anymore?
self.satellite = satellite
self.sensor = sensor
self.subsystem = subsystem
self.sensormode = sensormode
self.acq_datetime = acq_datetime
self.entity_ID = entity_ID
self.scene_ID = scene_ID
self.filename = filename
self.dataset_ID = dataset_ID
# set pathes (only if there are valid init args)
if image_type and satellite and sensor:
......@@ -348,7 +348,7 @@ class L1A_object(GMS_object):
# create and fill raster object
if n_files2search > 1:
if re.search('ETM+',self.sensor) and self.acquisition_date>datetime.datetime(year=2003,month=5,day=31):
if re.search('ETM+',self.sensor) and self.acq_datetime>datetime.datetime(year=2003, month=5, day=31):
expected_files_count = 2*len(full_LayerBandsAssignment)
else:
expected_files_count = len(full_LayerBandsAssignment)
......@@ -449,7 +449,7 @@ class L1A_object(GMS_object):
ds=None
def get_MetaObj(self, v=False):
def import_metadata(self, v=False):
"""Reads metainformation of the given file from the given ASCII metafile.
Works for: RapidEye (metadata.xml),SPOT(metadata.dim),LANDSAT(mtl.txt),ASTER(downloaded coremetadata),
ALOS(summary.txt & Leader file)
......@@ -457,9 +457,13 @@ class L1A_object(GMS_object):
:param v:
:return:
"""
self.logger.info('Reading %s %s %s metadata...' %(self.satellite,self.sensor,self.subsystem))
print('h1')
self.MetaObj = META.METADATA(self.satellite, self.subsystem, self.scene_ID, self.path_InFilePreprocessor,
self.path_MetaPreprocessor, self.logger, self.LayerBandsAssignment)
self.path_MetaPreprocessor, self.logger, self.LayerBandsAssignment) # TODO property (auto-conversion from meta_odict needed)
print('h2')
if v:
self.logger.info("The following metadata have been read:")
[self.logger.info("%20s : %-4s" % (key, val)) for key, val in self.MetaObj.get_meta_overview().items()]
......@@ -468,20 +472,22 @@ class L1A_object(GMS_object):
self.subsystem = self.MetaObj.Subsystem
self.arr.nodata = self.MetaObj.spec_vals['fill']
# update acquisition date to a complete datetime object incl. date, time and timezone
if self.MetaObj.AcqDateTime:
self.acq_datetime = self.MetaObj.AcqDateTime
def set_arr_desc_from_MetaObj(self):
"""Sets self.arr_desc according to MetaObj.PhysUnit. To be called directly after getting MetaObj.
"""
# TODO property
self.arr_desc = 'DN' if self.MetaObj.PhysUnit=='DN' else \
'Rad' if self.MetaObj.PhysUnit=="W * m-2 * sr-1 * micrometer-1" else \
'Ref' if re.search('TOA_Reflectance',self.MetaObj.PhysUnit,re.I) else \
'Temp' if re.search('Degrees Celsius',self.MetaObj.PhysUnit,re.I) else None # FIXME TOA_Ref
assert self.arr_desc, 'L1A_obj.MetaObj contains an unexpected physical unit: %s' %self.MetaObj.PhysUnit
# set arr_desc
self.arr_desc = 'DN' if self.MetaObj.PhysUnit == 'DN' else \
'Rad' if self.MetaObj.PhysUnit == "W * m-2 * sr-1 * micrometer-1" else \
'TOA_Ref' if re.search('TOA_Reflectance', self.MetaObj.PhysUnit, re.I) else \
'BOA_Ref' if re.search('BOA_Reflectance', self.MetaObj.PhysUnit, re.I) else \
'Temp' if re.search('Degrees Celsius', self.MetaObj.PhysUnit, re.I) else None
assert self.arr_desc, 'GMS_obj contains an unexpected physical unit: %s' % self.MetaObj.PhysUnit
def calc_TOARadRefTemp(self, subset=None):
"""Convert DN or Ref data to TOA Reflectance, to Radiance or to Surface Temperature
"""Convert DN, Rad or TOA_Ref data to TOA Reflectance, to Radiance or to Surface Temperature
(depending on CFG.usecase.conversion_type_optical and conversion_type_thermal).
The function can be executed by a L1A_object representing a full scene or a tile. To process a file from disk
in tiles, provide an item of self.tile_pos as the 'subset' argument."""
......@@ -513,9 +519,10 @@ class L1A_object(GMS_object):
for optical_thermal in ['optical', 'thermal']:
if optical_thermal not in self.dict_LayerOptTherm.values(): continue
conv = getattr(CFG.usecase, 'conversion_type_%s' % optical_thermal)
assert conv in ['Rad', 'Ref', 'Temp'], 'Unsupported conversion type: %s' %conv
conv = conv if conv!='BOA_Ref' else 'TOA_Ref'
assert conv in ['Rad', 'TOA_Ref', 'Temp'], 'Unsupported conversion type: %s' %conv
arr_desc = self.arr_desc.split('/')[0] if optical_thermal == 'optical' else self.arr_desc.split('/')[-1]
assert arr_desc in ['DN','Rad', 'Ref', 'Temp'], 'Unsupported array description: %s' %arr_desc
assert arr_desc in ['DN','Rad', 'TOA_Ref', 'Temp'], 'Unsupported array description: %s' %arr_desc
bList = [i for i,lr in enumerate(self.LayerBandsAssignment) if self.dict_LayerOptTherm[lr]==optical_thermal]
......@@ -529,18 +536,18 @@ class L1A_object(GMS_object):
if arr_desc == 'DN':
self.log_for_fullArr_or_firstTile('Calculating %s...' %conv, subset)
OFF, GAI, IRR, K1, K2 = [list(np.array(par)[bList]) for par in [off, gai, irr, k1, k2]]
res = GEOP.DN2Rad(inArray, OFF, GAI, inFill, inZero, inSaturated) if conv == 'Rad' else \
GEOP.DN2TOARef(inArray,OFF,GAI,IRR,zen,esd,inFill,inZero,inSaturated) if conv == 'Ref' else \
res = GEOP.DN2Rad(inArray, OFF, GAI, inFill, inZero, inSaturated) if conv == 'Rad' else \
GEOP.DN2TOARef(inArray,OFF,GAI,IRR,zen,esd,inFill,inZero,inSaturated) if conv == 'TOA_Ref' else \
GEOP.DN2DegreesCelsius_fastforward(inArray,OFF,GAI,K1,K2,0.95,inFill,inZero,inSaturated)
if conv=='Ref': self.MetaObj.ScaleFactor = CFG.usecase.scale_factor_TOARef
if conv=='TOA_Ref': self.MetaObj.ScaleFactor = CFG.usecase.scale_factor_TOARef
elif arr_desc == 'Rad':
raise NotImplementedError("Conversion Rad to %s is currently not supported." %conv)
elif arr_desc == 'Ref':
elif arr_desc == 'TOA_Ref':
if conv=='Rad':
raise NotImplementedError("Conversion Ref to Rad is currently not supported." % conv)
else: # conv=='Ref'
raise NotImplementedError("Conversion TOA_Ref to %s is currently not supported." %conv)
else: # conv=='TOA_Ref'
if self.MetaObj.ScaleFactor != CFG.usecase.scale_factor_TOARef:
res = self.rescale_array(inArray, CFG.usecase.scale_factor_TOARef, self.MetaObj.ScaleFactor)
self.MetaObj.ScaleFactor = CFG.usecase.scale_factor_TOARef
......@@ -549,7 +556,7 @@ class L1A_object(GMS_object):
else:
res = inArray
self.log_for_fullArr_or_firstTile('The input data already represents TOA '
'reflectance with the aimed scale factor of %d.' %CFG.usecase.scale_factor_TOARef)
'reflectance with the desired scale factor of %d.' %CFG.usecase.scale_factor_TOARef)
else: # arr_desc == 'Temp'
raise NotImplementedError("Conversion Temp to %s is currently not supported." %conv)
......@@ -656,7 +663,7 @@ class L1A_object(GMS_object):
if rasObj.get_projection_type() == 'UTM':
self.MetaObj.CornerTieP_UTM = rasObj.get_corner_coordinates('UTM')
self.meta_odict = self.MetaObj.Meta2ODict() # important in order to keep geotransform/projection
self.meta_odict = self.MetaObj.to_meta_odict() # important in order to keep geotransform/projection
if CFG.job.exec_mode=='Flink':
self.delete_tempFiles() # these files are needed later in Python execution mode
self.MetaObj.Dataname = previous_dataname # /vsi.. pointing directly to raw data archive (which exists)
......@@ -679,7 +686,7 @@ class L1A_object(GMS_object):
# inPath = self.arr if arr_isPath else self.MetaObj.Dataname if \
# (hasattr(self,'MetaObj') and self.MetaObj) else self.meta_odict['Dataname'] # FIXME ersetzen durch path generator?
if not self.path_cloud_class_obj:
if not self.path_cloud_class_obj or self.satellite=='Sentinel-2A': # FIXME dont exclude S2 here
self.log_for_fullArr_or_firstTile('Cloud masking is not yet implemented for %s %s...'
%(self.satellite,self.sensor),subset)
mask_clouds = None
......@@ -691,6 +698,18 @@ class L1A_object(GMS_object):
# rasObj = GEOP.GEOPROCESSING(inPath, self.logger, subset=subset)
# self.arr = rasObj.tondarray(direction=3)
# TODO Sentinel-2 mask:
#if not hasattr(s2img, "mask_clouds"):
# logger.info("Cloud mask missing -> derive own cloud mask.")
# CldMsk = CloudMask(logger=logger, persistence_file=options["cld_mask"]["persistence_file"],
# processing_tiles=options["cld_mask"]["processing_tiles"])
# s2img.mask_clouds = CldMsk(S2_img=s2img, target_resolution=options["cld_mask"]["target_resolution"],
# majority_filter_options=options["cld_mask"]["majority_mask_filter"],
# nodata_value=options["cld_mask"]['nodata_value_mask'])
# del CldMsk
self.GMS_identifier['logger'] = self.logger
if not CFG.job.bench_CLD_class:
self.path_cloud_class_obj = PG.get_path_cloud_class_obj(self.GMS_identifier)
......@@ -777,44 +796,75 @@ class L1A_object(GMS_object):
return {'desc':'mask_clouds','row_start':rS,'row_end':rE,'col_start':cS,'col_end':cE,'data':mask_clouds}
def calc_corner_positions(self):
def calc_corner_positions(self): # TODO revise that function (must return fullSceneCornerLonLat
"""Get true corner positions in the form
[UL, UR, LL, LR] as [(ULrow,ULcol),(URrow,URcol),...],[(ULLon,ULLat),(URLon,URLat),..]"""
# set lonlat corner positions for outer image edges
rows,cols = self.shape_fullArr[:2]
CorPosXY = [(0,0),(cols,0),(0,rows),(cols,rows)]
gt = self.mask_nodata.geotransform
prj = EPSG2WKT(self.mask_nodata.epsg) # using EPSG code ensures that excactly the same WKT code is used in Flink and Python mode
CorLatLon = pixelToLatLon(CorPosXY,geotransform=gt,projection=prj)
self.corner_lonlat = [tuple(reversed(i)) for i in CorLatLon]
# set true data corner positions (image coordinates)
assert self.arr_shape=='cube', 'The given GMS object must represent the full image cube for calculating ,' \
'true corner posistions. Received %s.' %self.arr_shape
assert hasattr(self,'mask_nodata') and self.mask_nodata is not None, "The L1A object needs to have a nodata mask."
self.logger.info('Calculating true data corner positions (image and world coordinates)...')
if re.search('ETM+', self.sensor) and self.acquisition_date > datetime.datetime(year=2003, month=5, day=31): # FIXME add time
self.trueDataCornerPos = calc_FullDataset_corner_positions(self.mask_nodata, algorithm='numpy')
if re.search('ETM+', self.sensor) and self.acq_datetime > datetime.datetime(year=2003, month=5, day=31):
self.trueDataCornerPos = calc_FullDataset_corner_positions(self.mask_nodata, algorithm='numpy',
assert_four_corners=False)
else:
self.trueDataCornerPos = calc_FullDataset_corner_positions(self.mask_nodata)
trueCorPosXY = [tuple(reversed(i)) for i in self.trueDataCornerPos]
gt = self.mask_nodata.geotransform
prj = EPSG2WKT(self.mask_nodata.epsg) # using EPSG code ensures that excactly the same WKT code is used in Flink and Python mode
self.trueDataCornerPos = calc_FullDataset_corner_positions(self.mask_nodata, assert_four_corners=False)
# set true data corner positions (lon/lat coordinates)
trueCorPosXY = [tuple(reversed(i)) for i in self.trueDataCornerPos]
trueCorLatLon = pixelToLatLon(trueCorPosXY,geotransform=gt,projection=prj)
self.trueDataCornerLonLat = [tuple(reversed(i)) for i in trueCorLatLon]
rows,cols = self.shape_fullArr[:2]
CorPosXY = [(0,0),(cols,0),(0,rows),(cols,rows)]
CorLatLon = pixelToLatLon(CorPosXY,geotransform=gt,projection=prj)
self.corner_lonlat = [tuple(reversed(i)) for i in CorLatLon]
# set full scene corner positions (image and lonlat coordinates)
if is_dataset_provided_as_fullScene(self.GMS_identifier):
assert len(self.trueDataCornerLonLat)==4, "A %s %s dataset is provided as full scene. Thus all 4 image " \
"corners must be present within the dataset. Found %s." %len(self.trueDataCornerLonLat)
self.fullSceneCornerPos = self.trueDataCornerPos
self.fullSceneCornerLonLat = self.trueDataCornerLonLat
else:
if re.search('AVNIR', self.sensor):
self.fullSceneCornerPos = calc_FullDataset_corner_positions(self.mask_nodata, algorithm='numpy',
assert_four_corners=False)
# set true data corner positions (lon/lat coordinates)
trueCorPosXY = [tuple(reversed(i)) for i in self.trueDataCornerPos]
trueCorLatLon = pixelToLatLon(trueCorPosXY, geotransform=gt, projection=prj)
self.fullSceneCornerLonLat = [tuple(reversed(i)) for i in trueCorLatLon]
else: # RapidEye or Sentinel-2 data
if re.search('Sentinel-2', self.satellite):
# get fullScene corner coordinates by database query
# -> calculate footprints for all granules of the same S2 datatake -> merge them and calculate overall corner positions
self.fullSceneCornerPos = [tuple(reversed(i)) for i in CorPosXY] # outer corner positions
self.fullSceneCornerLonLat = self.corner_lonlat # outer corner positions
else: # RapidEye
raise NotImplementedError # FIXME
def <