Commit 2b221626 authored by Daniel Scheffler

First version that is able to produce MGRS cubes in mass production (so far only fully operable for Landsat-8)

GEOP:
- merged calc_FullDataset_corner_positionsOLD() and calc_FullDataset_corner_positions() into calc_FullDataset_corner_positions()
	- the function now features two algorithms: 'numpy' (for Landsat-7 ETM+ SLC-off) and 'shapely' (for all remaining scenes)
- deleted calc_FullDataset_corner_positionsOLD()
- added is_point_on_grid(): Checks if a given point is exactly on the given coordinate grid. Needed to check if scene has to be coregistered or not.
- added is_coord_grid_equal(): Needed to check if scene has to be coregistered or not.
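A simplified, self-contained sketch of the grid check (the actual implementation uses HLP_F.find_nearest; the grid values below are made up):

    import numpy as np

    def is_point_on_grid(pointXY, xgrid, ygrid):
        # a point lies on the grid if snapping it to the nearest grid coordinate
        # in X and Y direction leaves it unchanged (simplified stand-in for
        # HLP_F.find_nearest)
        snap = lambda val, grid: grid[np.abs(np.asarray(grid) - val).argmin()]
        return snap(pointXY[0], xgrid) == pointXY[0] and snap(pointXY[1], ygrid) == pointXY[1]

    xgrid = np.arange(300000, 300301, 30)    # hypothetical 30m reference grid (X)
    ygrid = np.arange(5900040, 5900341, 30)  # hypothetical 30m reference grid (Y)
    print(is_point_on_grid((300060, 5900100), xgrid, ygrid))  # True
    print(is_point_on_grid((300075, 5900100), xgrid, ygrid))  # False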

L0A_P:
- updated name of jobs table in postgreSQL database

L1A_P:
- added attribute dataset_ID -> Needed to check if scene has to be coregistered or not.
- added attributes scenes_proc_ID and mgrs_tiles_proc_ID -> needed for database entries
- archive_to_rasObj() now asserts that the read archive is not empty
- bugfix for Landsat-7 SLC-off scenes: archive_to_rasObj() now expects more files in the archive due to SLC-off masks
- bugfix calc_mask_nodata(): data had been read from wrong source if job.exec_mode=='Python'
- revised calc_cloud_mask() -> the resulting array now contains correct pixel values and is properly included in GMS_obj.masks
- bugfix for Landsat-7 SLC-off scenes: calc_corner_positions() now calls a different algorithm that can deal with SLC-off gaps
- bugfix calc_corner_positions(): incorrect getter for projection
- bugfix build_combined_masks_array(): it now always outputs 8-bit unsigned integer
- added delete_previous_proc_level_results(): needed to save disk space by deleting results of previous processing levels that are not needed anymore
- delete_tempFiles() now also deletes results of previous processing levels on demand (depending on job.exec_L1AP[2])
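The on-demand cleanup is gated per processing level by the new flag in the job config; the pattern (as visible in the L1C_object diff below):

    # third element of the exec__L**P flag enables deletion of the previous
    # processing level's results (see CFG changes below)
    if getattr(job, 'exec__%sP' % self.proc_level)[2]:
        self.delete_previous_proc_level_results()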

L1B_P:
- added class COREG_GMS (not yet working) -> will replace class COREG soon and will synchronize its changes with external library CoReg_Sat
- COREG.__init__():
	- added attributes dataset_ID and coreg_needed -> Needed to check if scene has to be coregistered or not.
	-> the class now only runs coregistration if it is really needed; otherwise it is skipped (see the sketch after this list)
- COREG.get_updated_map_info(): no longer writes to the output stream in quiet mode
- L1B_object:
	- slightly changed logging behavior
	- apply_deshift_results(): bugfix for improperly updated map info and projection
	- apply_deshift_results(): bugfix for improperly updated dimensions of the mask attributes
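A hedged sketch of the new skip decision (the exact condition inside COREG.__init__ is assumed, not copied; the usecase attributes are the new CFG parameters listed below):

    # hypothetical condition: skip coregistration if the scene is the spatial
    # reference dataset itself or already sits exactly on the reference grid
    self.coreg_needed = not (self.dataset_ID == usecase.datasetid_spatial_ref or
                             GEOP.is_coord_grid_equal(self.shift_gt,
                                                      usecase.spatial_ref_gridx,
                                                      usecase.spatial_ref_gridy))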

L1C_P:
- delete_tempFiles() now also deletes results of previous processing levels on demand

L2A_P:
- bugfix in get_DESHIFTER_configs(): not properly closed gdal dataset
- added attributes shift_dataset_ID, x_shift_px, y_shift_px and deshift_needed in order to check if deshifting is really needed or not
- correct_shifts():
	- added proper logging
	- bugfix for missing updated_projection in case coregistration has been skipped
	- bugfix for not properly closed gdal dataset
	- bugfix for wrong resampling method for the cloud mask (cubic results in wrong output pixel values) -> now nearest neighbour (see the snippet after this list)
	- added 'is clipped to extent' to deshift_results
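The cloud mask resampling fix reduces to choosing the GDAL resampling code by attribute name (from the diff below):

    # 2 = cubic, 0 = nearest neighbour; nearest neighbour keeps the class
    # values of masks intact, cubic would blend them into invalid values
    rsp_alg = 2 if self.attrname2deshift not in ['masks', 'mask_clouds'] else 0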

L2B_P:
- bugfix spectral_homogenization(): now also accepts string-type array attributes (file paths) if exec_mode=='Python'
- revised interpolate_cube_linear()
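interpolate_cube_linear() interpolates each pixel spectrum along the band axis; a self-contained sketch of the approach (cube layout [rows, cols, bands] assumed, wavelengths are made up):

    import numpy as np
    from scipy.interpolate import interp1d

    def interpolate_cube_linear(cube, src_cwls, tgt_cwls):
        # band-wise linear interpolation from source to target center wavelengths
        return interp1d(np.array(src_cwls), cube, axis=2, kind='linear',
                        fill_value='extrapolate')(np.array(tgt_cwls))

    cube = np.random.rand(4, 4, 6)           # toy 6-band cube
    src  = [443, 490, 560, 665, 705, 740]    # hypothetical source CWLs [nm]
    tgt  = [460, 550, 640, 730]              # hypothetical target CWLs [nm]
    print(interpolate_cube_linear(cube, src, tgt).shape)  # (4, 4, 4)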

META:
- CS_DATUM, CS_UTM_ZONE, CS_TYPE are now properly written to gms-files
- added metaDict_to_metaODict() needed by output writer
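A minimal sketch of the idea behind metaDict_to_metaODict() (the concrete key order is an assumption; ENVI headers expect the mandatory fields first):

    import collections

    def metaDict_to_metaODict(meta):
        # hypothetical ordering: mandatory ENVI header keys first, rest alphabetically
        leading = ['description', 'samples', 'lines', 'bands', 'header offset',
                   'file type', 'data type', 'interleave', 'byte order']
        ordered = [k for k in leading if k in meta] + \
                  sorted(k for k in meta if k not in leading)
        return collections.OrderedDict((k, meta[k]) for k in ordered)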

INP_R:
- read_mask_subset(): added missing 'MGRS_tile' subset type

OUT_W:
- added silent_envi_write_image(): a monkey patch in order to silence spectral.io.envi._write_image (see the sketch after this list)
- added write_ordered_envi_header(): in order to write properly sorted ENVI header files
- revised Tiles_Writer(): previous version caused output errors for mask attributes of GMS objects
- revised mask_to_ENVI_Classification(): bugfixes for wrong conversions of numpy arrays to an ENVI-classification-ready array and metadata
- added check_header_not_empty()
- major revision of Obj2ENVI(): now it properly writes MGRS tiles if exec_mode=='Python'
- added database update for MGRS tiles (new records created in new database table 'mgrs_tiles_proc')
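The monkey patch follows the usual wrap-and-replace pattern; a generic sketch (assuming the original writer only needs its stdout silenced, so the signature is passed through untouched):

    import io, sys
    import spectral.io.envi

    _original_write_image = spectral.io.envi._write_image

    def silent_envi_write_image(*args, **kwargs):
        # run the original writer with stdout redirected to a dummy stream
        _stdout, sys.stdout = sys.stdout, io.StringIO()
        try:
            return _original_write_image(*args, **kwargs)
        finally:
            sys.stdout = _stdout

    spectral.io.envi._write_image = silent_envi_write_image  # monkey patch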

DB_T:
- split get_postgreSQL_matchingExp() into get_postgreSQL_value() and get_postgreSQL_matchingExp()
	-> many more Python variable types can now be converted to postgreSQL types (see the sketch after this list)
- added create_record_in_postgreSQLdb(): needed to create new records for MGRS tiles
- added class GMS_JOB: a job manager for GMS jobs
	-> current features:
		- create a job in the postgreSQL database, based on a list of archive files
		- get the details about an existing job
- added add_missing_filenames_in_pgSQLdb(): useful for finding and fixing database errors
- import_shapefile_into_postgreSQL_database(): updated name of postgreSQL database
- major revision of data_DB_updater(): now it properly updates both tables 'scenes_proc' and 'mgrs_tiles_proc'
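get_postgreSQL_value() renders Python values as SQL literals; a simplified sketch (the real type coverage in DB_T is broader):

    def get_postgreSQL_value(value):
        # map a Python value to a postgreSQL literal (simplified)
        if value is None:
            return 'NULL'
        if isinstance(value, bool):                     # must precede the int check
            return 'TRUE' if value else 'FALSE'
        if isinstance(value, (int, float)):
            return str(value)
        if isinstance(value, (list, tuple)):            # array literal
            return "'{%s}'" % ','.join(str(v) for v in value)
        return "'%s'" % str(value).replace("'", "''")   # quoted/escaped string

    print(get_postgreSQL_value([14, 15]))   # '{14,15}'
    print(get_postgreSQL_value("O'Brien"))  # 'O''Brien'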

HLP_F:
- added attribute proc_chain
- get_subset_GMS_obj(): bugfix for clipping the wrong tile dimension
- get_subset_GMS_obj(): added updated tile corners and data corners to MGRS tiles
- cut_GMS_obj_into_MGRS_tiles(): added calculation of tile bounds as well as data corner positions
- _numba_array_merger(): added a type assertion
- get_valid_arrSubsetBounds(): bugfix for returning wrong tile dimensions (1 row/column too big)
- sceneID_to_trueDataCornerLonLat(): added valid polygon assertion
- added class Landsat_entityID_decrypter: needed by class GMS_JOB for finding correct sensor parameters for Landsat data
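Landsat entity IDs encode the sensor parameters positionally (pattern LXSPPPRRRYYYYDDDGSIVV); a hedged decoding sketch, not the actual class interface:

    def decode_landsat_entity_id(entity_id):
        # e.g. 'LC81930242015276LGN00' -> OLI_TIRS, Landsat-8, path 193, row 24,
        # day-of-year 276 of 2015 (pre-collection entity ID format)
        assert len(entity_id) == 21 and entity_id[0] == 'L'
        sensors = {'C': 'OLI_TIRS', 'O': 'OLI', 'T': 'TM', 'E': 'ETM+', 'M': 'MSS'}
        return {'sensor':    sensors.get(entity_id[1], entity_id[1]),
                'satellite': 'Landsat-%s' % entity_id[2],
                'path':      int(entity_id[3:6]),
                'row':       int(entity_id[6:9]),
                'year':      int(entity_id[9:13]),
                'doy':       int(entity_id[13:16])}

    print(decode_landsat_entity_id('LC81930242015276LGN00'))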

PG:
- added get_path_rawdata()
- get_path_procdata(): changed returned path structure for MGRS tiles saved to disk
- get_baseN(): changed returned basename for GMS objects on disk

CFG:
- updated postgreSQL database connection credentials
- added a "delete output of previous porcessing levels"-flag to exec__L**P
- usecase: added datasetid_spatial_ref, virtual_sensor_name, target_gsd, EPSG, spatial_ref_gridx, spatial_ref_gridy

PC:
- added multiple job IDs containing pre-defined database jobs
- updated L1B_map_1(): shift calculation is now skipped if no reference scene is needed or available
- added L2A_map(): needed to execute processing in mass production, controlled by Python process controller
- revised tempfile deletion within L1C_map_1(), L2A_map_1(), L2B_map_1()
- updated L2A mapper calls -> exec_mode=='Python' is now supported
- added hardcoded thread count restriction to L2B mapper call due to memory errors in Python execution mode with parallelization level 'scenes'

pgSQLdb:
- renamed database from usgs_cache to geomultisens
- added table 'mgrs_tiles_proc'
- renamed table 'scenes_jobs' to 'jobs'
- updated jobs table with new columns
- fixed missing filenames in scenes table
- ...
parent ec08aa8f
......@@ -2782,7 +2782,7 @@ def warp_ndarray(ndarray, in_gt, in_prj, out_prj, out_gt=None, outRowsCols=None,
if not ndarray.flags['OWNDATA']:
temp = np.empty_like(ndarray)
temp[:] = ndarray
ndarray = temp # deep copy: converts view to its own array in order to avoid wrong output
with rasterio.drivers():
if outUL is not None:
......@@ -2990,7 +2990,8 @@ def pixelToLatLon(pixelPairs, path_im=None, geotransform=None, projection=None):
"""The following method translates given pixel locations into latitude/longitude locations on a given GEOTIF
This method does not take into account pixel size and assumes a high enough
image resolution for pixel size to be insignificant
:param pixelPairs: Image coordinate pairs to be translated in the form [[x1,y1],[x2,y2]]
:param path_im: The file location of the input image
:param geotransform: GDAL GeoTransform
:param projection: GDAL Projection
......@@ -3025,6 +3026,7 @@ def pixelToLatLon(pixelPairs, path_im=None, geotransform=None, projection=None):
def pixelToMapYX(pixelCoords, path_im=None, geotransform=None, projection=None):
"""Translates given pixel locations into map coordinates of the given image.
:param pixelCoords: The pixel coordinates to be translated in the form [x,y].
A list of coordinate pairs is possible: [[x1,y1],[x2,y2]].
:param path_im: absolute path of the image to be used to get geotransform and projection information
......@@ -3173,7 +3175,7 @@ def mapinfo2geotransform(map_info):
return [float(map_info[3]),float(map_info[5]),0.,float(map_info[4]),0.,-float(map_info[6])]
def get_corner_coordinates(gdal_ds=None, gt=None, cols=None, rows=None):
"""Return (UL, LL, LR, UR)"""
"""Returns (UL, LL, LR, UR) in the same coordinate units like the given geotransform."""
assert gdal_ds or (gt and cols and rows), \
"GEOP.get_corner_coordinates: Missing argument! Please provide either 'gdal_ds' or 'gt', 'cols' AND 'rows'."
gdal_ds_GT = gdal_ds.GetGeoTransform() if gdal_ds else gt
......@@ -3429,116 +3431,118 @@ def DN2DegreesCelsius_fastforward(ndarray,offsets, gains, K1, K2, emissivity=0.9
#degCel = np.piecewise(degCel,[ndarray==inFill,ndarray==inSaturated,ndarray==inZero],[outFill,outSaturated,outZero])
return degCel
def calc_FullDataset_corner_positionsOLD(mask_1bit):
"""
Calculates the image coordinates of the true data corners for rotated datasets.
ONLY usable for entire images - no tiles!
:param mask_1bit: 2D-numpy array 1bit
:return: [UL, UR, LL, LR] as [(ULrow,ULcol),(URrow,URcol),...]
"""
rows,cols = mask_1bit.shape[:2]
cols_containing_data = np.any(mask_1bit, axis=0)
rows_containing_data = np.any(mask_1bit, axis=1)
first_dataCol = list(cols_containing_data).index(True)
last_dataCol = cols - (list(reversed(cols_containing_data)).index(True) + 1)
first_dataRow = list(rows_containing_data).index(True)
last_dataRow = rows - (list(reversed(rows_containing_data)).index(True) + 1)
if first_dataCol == 0 and first_dataRow == 0 and last_dataCol == cols-1 and last_dataRow == rows-1 \
and mask_1bit[0, 0] == True and mask_1bit[0, cols - 1] == True \
and mask_1bit[rows - 1, 0] == True and mask_1bit[rows - 1, cols - 1] == True:
UL, UR, LL, LR = (0, 0), (0, cols - 1), (rows - 1, 0), (rows - 1, cols - 1)
else:
StartStopRows_in_first_dataCol = [list(mask_1bit[:, first_dataCol]).index(True),
(rows - list(reversed(mask_1bit[:, first_dataCol])).index(True) + 1)]
StartStopRows_in_last_dataCol = [list(mask_1bit[:, last_dataCol]).index(True),
(rows - list(reversed(mask_1bit[:, last_dataCol])).index(True) + 1)]
StartStopCols_in_first_dataRow = [list(mask_1bit[first_dataRow, :]).index(True),
(cols - list(reversed(mask_1bit[first_dataRow, :])).index(True) + 1)]
StartStopCols_in_last_dataRow = [list(mask_1bit[last_dataRow, :]).index(True),
(cols - list(reversed(mask_1bit[last_dataRow, :])).index(True) + 1)]
if True in [abs(np.diff(i)[0]) > 10 for i in
[StartStopRows_in_first_dataCol, StartStopRows_in_last_dataCol, StartStopCols_in_first_dataRow,
StartStopCols_in_last_dataRow]]:
''' In case of cut image corners (e.g. ALOS AVNIR-2 Level-1B2): Calculation of trueDataCornerPos outside of the image.'''
# print ('Image corners detected to be cut.')
def find_line_intersection_point(line1, line2):
xdiff = (line1[0][0] - line1[1][0], line2[0][0] - line2[1][0])
ydiff = (line1[0][1] - line1[1][1], line2[0][1] - line2[1][1])
def det(a, b):
return a[0] * b[1] - a[1] * b[0]
div = det(xdiff, ydiff)
if div == 0:
raise Exception('lines do not intersect')
d = (det(*line1), det(*line2))
x = det(d, xdiff) / div
y = det(d, ydiff) / div
return x, y
line_N = ((StartStopCols_in_first_dataRow[1], first_dataRow),
(last_dataCol, StartStopRows_in_last_dataCol[0]))
line_S = ((first_dataCol, StartStopRows_in_first_dataCol[1]),
(StartStopCols_in_last_dataRow[0], last_dataRow))
line_W = ((StartStopCols_in_first_dataRow[0], first_dataRow),
(first_dataCol, StartStopRows_in_first_dataCol[0]))
line_O = ((last_dataCol, StartStopRows_in_last_dataCol[1]),
(StartStopCols_in_last_dataRow[1], last_dataRow))
corners = [list(reversed(find_line_intersection_point(line_N, line_W))),
list(reversed(find_line_intersection_point(line_N, line_O))),
list(reversed(find_line_intersection_point(line_S, line_W))),
list(reversed(find_line_intersection_point(line_S, line_O)))]
else:
dataRow_in_first_dataCol = np.mean(StartStopRows_in_first_dataCol)
dataRow_in_last_dataCol = np.mean(StartStopRows_in_last_dataCol)
dataCol_in_first_dataRow = np.mean(StartStopCols_in_first_dataRow)
dataCol_in_last_dataRow = np.mean(StartStopCols_in_last_dataRow)
corners = [(first_dataRow, dataCol_in_first_dataRow), (dataRow_in_last_dataCol, last_dataCol),
(last_dataRow, dataCol_in_last_dataRow), (dataRow_in_first_dataCol, first_dataCol)]
getClosest = lambda refR, refC: \
corners[np.array([np.sqrt((refR - c[0]) ** 2 + (refC - c[1]) ** 2) for c in corners]).argmin()]
UL,UR,LL,LR = [getClosest(refR,refC) for refR,refC in [(0,0),(0,cols-1),(rows-1,0),(rows-1,cols-1)]]
# if UL[0] == 0 and UR[0] == 0:
# envi.save_image(os.path.abspath('./testing/out/__faulty_mask_1bit.hdr'), self.mask_1bit, dtype= 'uint8', interleave='bsq', ext='.bsq',force=True)
#print(UL,UR,LL,LR)
return UL, UR, LL, LR
def calc_FullDataset_corner_positions(mask_1bit, assert_four_corners=True, algorithm='shapely'):
"""
Calculates the image coordinates of the true data corners for rotated datasets.
ONLY usable for entire images - no tiles!
NOTE: The algorithm calculates the corner coordinates of the convex hull of the given mask. Since the convex hull
does not always reflect all of the true corner coordinates, the result is limited in this regard.
:param mask_1bit: 2D-numpy array 1bit
:param assert_four_corners: <bool> whether to assert that exactly four corner coordinates have been found
:param algorithm: <str> 'shapely' or 'numpy'
:return: [UL, UR, LL, LR] as [(ULrow,ULcol),(URrow,URcol),...]
"""
rows, cols = mask_1bit.shape[:2]
if sum([mask_1bit[0,0], mask_1bit[0,cols-1], mask_1bit[rows-1,0], mask_1bit[rows-1,cols-1]]) == 1:
out = [(0,0),(0,cols-1),(rows-1,0),(rows-1,cols-1)] # UL, UR, LL, LR
else:
assert algorithm in ['shapely','numpy'], "Algorithm '%s' does not exist. Choose between 'shapely' and 'numpy'." % algorithm
if algorithm == 'shapely':
# get outline
topbottom = np.empty((1, 2 * mask_1bit.shape[1]), dtype=np.uint16)
topbottom[0, 0:mask_1bit.shape[1]] = np.argmax(mask_1bit, axis=0)
topbottom[0, mask_1bit.shape[1]:] = (mask_1bit.shape[0] - 1) - np.argmax(np.flipud(mask_1bit), axis=0)
mask = np.tile(np.any(mask_1bit, axis=0), (2,))
xvalues = np.tile(np.arange(mask_1bit.shape[1]), (1, 2))
outline = np.vstack([topbottom, xvalues])[:, mask].T
poly = Polygon(outline).convex_hull
poly = poly.simplify(20, preserve_topology=True)
poly = Polygon(list(set(poly.exterior.coords))).convex_hull # eliminate duplicates except one (because the polygon is closed)
unique_corn_YX = sorted(set(poly.exterior.coords), key=lambda x: poly.exterior.coords[:].index(x)) # [(row,col),(row,col),...]
if assert_four_corners:
assert len(unique_corn_YX)==4, 'Found %s corner coordinates instead of 4. Exiting.' %len(unique_corn_YX)
get_dist = lambda P1yx ,P2yx : np.sqrt((P1yx[0]-P2yx[0])**2 + (P1yx[1]-P2yx[1])**2)
getClosest = lambda tgtYX,srcYXs: srcYXs[np.array([get_dist(tgtYX,srcYX) for srcYX in srcYXs]).argmin()]
out = [getClosest(tgtYX,unique_corn_YX) for tgtYX in [(0,0),(0,cols-1),(rows-1,0),(rows-1,cols-1)]]
else:
out = unique_corn_YX
else: # alg='numpy'
cols_containing_data = np.any(mask_1bit, axis=0)
rows_containing_data = np.any(mask_1bit, axis=1)
first_dataCol = list(cols_containing_data).index(True)
last_dataCol = cols - (list(reversed(cols_containing_data)).index(True) + 1)
first_dataRow = list(rows_containing_data).index(True)
last_dataRow = rows - (list(reversed(rows_containing_data)).index(True) + 1)
StartStopRows_in_first_dataCol = [list(mask_1bit[:, first_dataCol]).index(True),
(rows - list(reversed(mask_1bit[:, first_dataCol])).index(True) + 1)]
StartStopRows_in_last_dataCol = [list(mask_1bit[:, last_dataCol]).index(True),
(rows - list(reversed(mask_1bit[:, last_dataCol])).index(True) + 1)]
StartStopCols_in_first_dataRow = [list(mask_1bit[first_dataRow, :]).index(True),
(cols - list(reversed(mask_1bit[first_dataRow, :])).index(True) + 1)]
StartStopCols_in_last_dataRow = [list(mask_1bit[last_dataRow, :]).index(True),
(cols - list(reversed(mask_1bit[last_dataRow, :])).index(True) + 1)]
if True in [abs(np.diff(i)[0]) > 10 for i in
[StartStopRows_in_first_dataCol, StartStopRows_in_last_dataCol, StartStopCols_in_first_dataRow,
StartStopCols_in_last_dataRow]]:
''' In case of cut image corners (e.g. ALOS AVNIR-2 Level-1B2): Calculation of trueDataCornerPos outside of the image.'''
# print ('Image corners detected to be cut.')
def find_line_intersection_point(line1, line2):
xdiff = (line1[0][0] - line1[1][0], line2[0][0] - line2[1][0])
ydiff = (line1[0][1] - line1[1][1], line2[0][1] - line2[1][1])
def det(a, b):
return a[0] * b[1] - a[1] * b[0]
div = det(xdiff, ydiff)
if div == 0:
raise Exception('lines do not intersect')
d = (det(*line1), det(*line2))
x = det(d, xdiff) / div
y = det(d, ydiff) / div
return x, y
line_N = ((StartStopCols_in_first_dataRow[1], first_dataRow),
(last_dataCol, StartStopRows_in_last_dataCol[0]))
line_S = ((first_dataCol, StartStopRows_in_first_dataCol[1]),
(StartStopCols_in_last_dataRow[0], last_dataRow))
line_W = ((StartStopCols_in_first_dataRow[0], first_dataRow),
(first_dataCol, StartStopRows_in_first_dataCol[0]))
line_O = ((last_dataCol, StartStopRows_in_last_dataCol[1]),
(StartStopCols_in_last_dataRow[1], last_dataRow))
corners = [list(reversed(find_line_intersection_point(line_N, line_W))),
list(reversed(find_line_intersection_point(line_N, line_O))),
list(reversed(find_line_intersection_point(line_S, line_W))),
list(reversed(find_line_intersection_point(line_S, line_O)))]
else:
dataRow_in_first_dataCol = np.mean(StartStopRows_in_first_dataCol)
dataRow_in_last_dataCol = np.mean(StartStopRows_in_last_dataCol)
dataCol_in_first_dataRow = np.mean(StartStopCols_in_first_dataRow)
dataCol_in_last_dataRow = np.mean(StartStopCols_in_last_dataRow)
corners = [(first_dataRow, dataCol_in_first_dataRow), (dataRow_in_last_dataCol, last_dataCol),
(last_dataRow, dataCol_in_last_dataRow), (dataRow_in_first_dataCol, first_dataCol)]
getClosest = lambda refR, refC: \
corners[np.array([np.sqrt((refR - c[0]) ** 2 + (refC - c[1]) ** 2) for c in corners]).argmin()]
out = [getClosest(refR, refC) for refR, refC in [(0,0), (0, cols-1), (rows-1, 0), (rows-1, cols-1)]]
# if UL[0] == 0 and UR[0] == 0:
# envi.save_image(os.path.abspath('./testing/out/__faulty_mask_1bit.hdr'), self.mask_1bit, dtype= 'uint8', interleave='bsq', ext='.bsq',force=True)
# print(UL,UR,LL,LR)
return out
def is_granule(trueCornerPos): # TODO
......@@ -3862,4 +3866,26 @@ def get_subsetProps_from_subsetArg(shape_fullArr,subset):
return collections.OrderedDict(zip(
['rows','cols','bands','rowStart','rowEnd','colStart','colEnd','bandStart','bandEnd','bandsList'],
[ rows , cols , bands , rowStart , rowEnd , colStart , colEnd , bandStart , bandEnd , bandsList ]))
def is_point_on_grid(pointXY,xgrid,ygrid):
# type: (tuple,np.array,np.array)
"""Checks if a given point is exactly on the given coordinate grid.
:param pointXY: (X,Y) coordinates of the point to check
:param xgrid: numpy array defining the coordinate grid in x-direction
:param ygrid: numpy array defining the coordinate grid in y-direction
"""
if HLP_F.find_nearest(xgrid,pointXY[0],extrapolate=True)==pointXY[0] and \
HLP_F.find_nearest(ygrid,pointXY[1],extrapolate=True)==pointXY[1]:
return True
else:
return False
def is_coord_grid_equal(gt,xgrid,ygrid):
ULx, xgsd, _, ULy, _, ygsd = gt
if is_point_on_grid((ULx,ULy),xgrid,ygrid) and xgsd==abs(xgrid[1]-xgrid[0]) and abs(ygsd)==abs(ygrid[1]-ygrid[0]):
return True
else:
return False
......@@ -78,7 +78,7 @@ def get_entity_IDs_within_AOI(): # called in console mode
def get_data_list_of_current_jobID(): # called in webapp mode
query = lambda tablename,vals2return,cond_dict,records2fetch=0:\
DB_T.get_info_from_postgreSQLdb(job.conn_database,tablename,vals2return,cond_dict,records2fetch)
resultset = query('jobs','sceneids',{'id':job.ID},3)
assert len(resultset) != 0, "Invalid jobID given - no corresponding job with the ID=%s found in database.\n" %job.ID
assert len(resultset) == 1, "Error in database. The jobID %s exists more than once. \n" %job.ID
sceneids = resultset[0][0]
......
......@@ -143,3 +143,7 @@ class L1C_object(L1B_object):
self.SAA_arr = None # not needed anymore
self.RAA_arr = None # not needed anymore
# delete previous proc_level results
if getattr(job,'exec__%sP'%self.proc_level)[2]:
self.delete_previous_proc_level_results()
......@@ -75,8 +75,9 @@ def get_DESHIFTER_configs(dicts_GMS_obj, attrnames2deshift, proc_bandwise=False,
bands = obj[attrname].shape[2] if len(obj[attrname].shape)==3 else None
elif isinstance(obj[attrname],str):
if os.path.exists(obj[attrname]):
ds = gdal.Open(obj[attrname])
bands = ds.RasterCount if ds.RasterCount > 1 else None
ds = None
else:
warnings.warn('get_DESHIFTER_configs: Missing file %s. File skipped.' % obj[attrname])
continue
......@@ -141,17 +142,21 @@ class DESHIFTER(object):
# unpack args
self.shift_scene_ID = dict_GMS_obj['scene_ID']
self.shift_entity_ID = dict_GMS_obj['entity_ID']
self.shift_dataset_ID = dict_GMS_obj['dataset_ID']
self.shift_prj = dict_GMS_obj['meta']['coordinate system string']
self.shift_gt = GEOP.mapinfo2geotransform(dict_GMS_obj['meta']['map info'])
self.shift_trueDataCornerPos = dict_GMS_obj['trueDataCornerPos']
self.shift_trueDataCornerLonLat = dict_GMS_obj['trueDataCornerLonLat']
self.im2shift = dict_GMS_obj[attrname2deshift]
self.shift_baseN = dict_GMS_obj['baseN']
self.path_logfile = dict_GMS_obj['path_logfile']
self.shift_shape = self.get_dims_im2shift()
self.ds_im2shift = None # from disk or from L1A object; set by self.get_ds_im2shift() called from self.correct shifts)
self.attrname2deshift = attrname2deshift
self.updated_map_info = dict_GMS_obj['coreg_info']['updated map info']
self.original_map_info= dict_GMS_obj['coreg_info']['original map info']
self.updated_gt = GEOP.mapinfo2geotransform(self.updated_map_info)
self.ref_scene_ID = dict_GMS_obj['coreg_info']['reference scene ID']
self.ref_entity_ID = dict_GMS_obj['coreg_info']['reference entity ID']
......@@ -161,21 +166,28 @@ class DESHIFTER(object):
self.ref_rows = dict_GMS_obj['coreg_info']['reference extent']['rows']
self.is_shifted = dict_GMS_obj['coreg_info']['is shifted']
self.is_resampled = dict_GMS_obj['coreg_info']['is resampled']
self.x_shift_px = dict_GMS_obj['coreg_info']['corrected_shifts_px']['x']
self.y_shift_px = dict_GMS_obj['coreg_info']['corrected_shifts_px']['y']
self.updated_projection = self.ref_prj
self.deshift_needed = self.x_shift_px or self.y_shift_px
# set the rest
# self.ref_xgsd,self.shift_xgsd = abs(self.ref_gt[1]), abs(self.shift_gt[1])
# self.ref_ygsd,self.shift_ygsd = abs(self.ref_gt[5]), abs(self.shift_gt[5])
# if not isinstance(out_gsd,int) or (isinstance(out_gsd,list) and len(out_gsd)==2): raise ValueError
# self.out_gsd = [self.shift_xgsd, self.shift_ygsd] if out_gsd is None else \
# [self.ref_xgsd, self.ref_ygsd] if self.match_gsd else \
# [out_gsd, out_gsd] if isinstance(out_gsd,int) else out_gsd
self.ref_xgsd,self.shift_xgsd = abs(self.ref_gt[1]), abs(self.shift_gt[1]) # FIXME the reference grid must already be defined by the config, not by the reference scene
self.ref_ygsd,self.shift_ygsd = abs(self.ref_gt[5]), abs(self.shift_gt[5])
if not isinstance(out_gsd,int) or (isinstance(out_gsd,list) and len(out_gsd)==2): raise ValueError
self.out_gsd = [self.shift_xgsd, self.shift_ygsd] if out_gsd is None else \
[self.ref_xgsd, self.ref_ygsd] if self.match_gsd else \
[out_gsd, out_gsd] if isinstance(out_gsd,int) else out_gsd
self.shift_xgsd = abs(self.shift_gt[1])
self.shift_ygsd = abs(self.shift_gt[5])
if self.deshift_needed:
self.ref_xgsd = abs(self.ref_gt[1]) # FIXME the reference grid must already be defined by the config, not by the reference scene
self.ref_ygsd = abs(self.ref_gt[5])
if out_gsd is not None and not (isinstance(out_gsd,int) or (isinstance(out_gsd,list) and len(out_gsd)==2)):
raise ValueError
#self.out_gsd = [self.shift_xgsd, self.shift_ygsd] if out_gsd is None else \
# [self.ref_xgsd, self.ref_ygsd] if self.match_gsd else \
# [out_gsd, out_gsd] if isinstance(out_gsd,int) else out_gsd
else:
self.align_grids = False # ensures proper processing in self.correct_shifts()
self.out_gsd = usecase.target_gsd
tmpBaseN = '%s__shifted_to__%s' %(dict_GMS_obj['entity_ID'],self.ref_entity_ID)
tempdir = '/dev/shm/GeoMultiSens/' if os.path.exists('/dev/shm/GeoMultiSens/') else\
......@@ -204,7 +216,7 @@ class DESHIFTER(object):
arr_ds = None
gdal.Unlink(path_src)
else:
raise TypeError('DESHIFTER.im2shift must contain a path or a numpy array. Got %s' %type(self.im2shift))
return ds
def get_dims_im2shift(self):
......@@ -221,31 +233,41 @@ class DESHIFTER(object):
raise TypeError('DESHIFTER.im2shift must contain a path or a numpy array. Got %s' %type(self.im2shift))
return rows, cols, bands
def correct_shifts(self):
# type: (L2A_P.DESHIFTER) -> collections.OrderedDict
temp_logger = HLP_F.setup_logger('log__' + self.shift_baseN, self.path_logfile, append=1)
t0 = time.time()
equal_prj = GEOP.get_proj4info(proj=self.ref_prj)==GEOP.get_proj4info(proj=self.shift_prj) \
if self.deshift_needed else True
if equal_prj and not self.align_grids and self.out_gsd in [None,[self.shift_xgsd,self.shift_ygsd]]:
"""NO RESAMPLING NEEDED"""
print("DESHIFT: Only the map info of %s has been updated because 'align_grids' is turned off, the pixel "
"sizes keep the same and source and target projections are equal. " %self.shift_entity_ID)
self.is_shifted = True
self.is_resampled = False
if self.cliptoextent:
if self.shift_gt==self.updated_gt:
temp_logger.info("Attribute '%s', band %s has only been clipped to it's extent because no shifts "
"have been detected." %(self.attrname2deshift,self.band2process))
else:
temp_logger.info("Only the map info of %s, attribute '%s', band %s has been updated because "
"'align_grids' is turned off, the pixel sizes keep the same and source and target "
"projections are equal. " %(self.shift_entity_ID,self.attrname2deshift,self.band2process))
# get true bounds in pixel coordinates
df_xmin,df_xmax,df_ymin,df_ymax = 0, self.shift_shape[1], 0, self.shift_shape[0] # defaults
xmin,xmax,ymin,ymax = GEOP.corner_coord_to_minmax([[c[1],c[0]] for c in self.shift_trueDataCornerPos])
xmin,xmax,ymin,ymax = [int(c) if c>=0 else int(df_c) for c,df_c in # replace possibly negative px-coordinates
zip([xmin,xmax,ymin,ymax],[df_xmin,df_xmax,df_ymin,df_ymax])]
# overwrite self.arr_shifted with subset read from L1A array
if isinstance(self.im2shift,str): # self.im2shift contains path to disk
tmp_arr = rasterio.open(self.im2shift).read(self.band2process, window=((ymin,ymax+1),(xmin,xmax+1))) # reads all bands if self.band2process is None
in_arr = np.swapaxes(np.swapaxes(tmp_arr,0,2),0,1) if len(tmp_arr.shape)==3 else tmp_arr
elif isinstance(self.im2shift,np.ndarray): # self.im2shift contains array
in_arr = self.im2shift[ymin:ymax+1, xmin:xmax+1] if len(self.im2shift.shape)==2 else \
......@@ -254,15 +276,16 @@ class DESHIFTER(object):
raise Exception('Unexpected attribute type for attribute DESHIFTER.im2shift.')
self.arr_shifted = in_arr
# update gt
self.updated_projection = self.updated_projection if self.updated_projection else self.shift_prj # self.updated_projection is taken from ref_prj, which is None if the target image does not have to be coregistered
self.updated_gt[3],self.updated_gt[0] =\
GEOP.pixelToMapYX([xmin,ymin],geotransform=self.updated_gt,projection=self.updated_projection)[0]
self.updated_map_info = GEOP.geotransform2mapinfo(self.updated_gt,self.updated_projection)
else:
pass # FIXME
else: # FIXME equal_prj==False is NOT implemented yet
"""RESAMPLING NEEDED"""
xgsd, ygsd = self.out_gsd
self.updated_gt[1], self.updated_gt[5] = xgsd, -ygsd
if self.cliptoextent:
......@@ -310,8 +333,8 @@ class DESHIFTER(object):
path_src = PG.get_tempfile(ext='.vrt') # FIXME writes a .vrt into a tempfile
in_ds = gdal.GetDriverByName('VRT').CreateCopy(path_src, ds_tmp, 0) # FIXME why write a VRT here at all
in_nodata = get_nodata(in_ds)
in_ds = ds_tmp = None
elif isinstance(self.im2shift,np.ndarray):
"""self.im2shift contains array -> gdalwarp input has to come from VRT that can be built from ds.im2shift"""
in_ds = self.get_ds_im2shift()
path_src = in_ds.GetDescription()
......@@ -333,10 +356,10 @@ class DESHIFTER(object):
print('reading from', ds_shifted.GetDescription())
if self.band2process is None:
dim2RowsColsBands = lambda A: np.swapaxes(np.swapaxes(A,0,2),0,1) # rasterio.open(): [bands,rows,cols]
self.arr_shifted = dim2RowsColsBands(rasterio.open(self.path_tmp).read())
else:
self.arr_shifted = rasterio.open(self.path_tmp).read(self.band2process)
self.is_shifted = True
self.is_resampled = True
......@@ -362,13 +385,18 @@ class DESHIFTER(object):
# apply XY-shifts to shift_gt
self.shift_gt[0], self.shift_gt[3] = self.updated_gt[0], self.updated_gt[3]
in_nodata = HLP_F.get_outFillZeroSaturated(in_arr.dtype)[0]
# resample cloud mask with nearest neighbour alg. in order to prevent changes in pixel values that represent different object classes
rsp_alg = 2 if self.attrname2deshift not in ['masks','mask_clouds'] else 0
if self.cliptoextent:
out_arr, out_gt, out_prj = GEOP.warp_ndarray(in_arr,self.shift_gt,self.shift_prj,self.ref_prj,
rsp_alg=rsp_alg, in_nodata=in_nodata, out_nodata=in_nodata, out_res=(xgsd,xgsd),
out_extent=[xmin,ymin,xmax,ymax]) # FIXME resamp_alg: "average" causes sinus-shaped patterns in fft images
else:
out_arr, out_gt, out_prj = GEOP.warp_ndarray(in_arr,self.shift_gt,self.shift_prj,self.ref_prj,
rsp_alg=rsp_alg, in_nodata=in_nodata, out_nodata=in_nodata, out_res=(xgsd,xgsd), out_gt=out_gt)
# output array is shifted AND has no odd corner coordinates because it is aligned to reference grid!
self.updated_projection = self.ref_prj
self.arr_shifted = out_arr
......@@ -378,20 +406,23 @@ class DESHIFTER(object):
self.is_resampled = True
self.set_deshift_results()
temp_logger.info("Calculated shift-corrected array for attribute '%s', band %s... %.2fs"
%(self.attrname2deshift, self.band2process,time.time()-t0))
return self.deshift_results
def set_deshift_results(self):
self.deshift_results = collections.OrderedDict()
self.deshift_results.update({'scene_ID' :self.shift_scene_ID})
self.deshift_results.update({'entity_ID' :self.shift_entity_ID})
self.deshift_results.update({'attrname' :self.attrname2deshift})
self.deshift_results.update({'band' :self.band2process})
self.deshift_results.update({'is shifted' :self.is_shifted})
self.deshift_results.update({'is resampled' :self.is_resampled})
self.deshift_results.update({'updated map info' :self.updated_map_info})
self.deshift_results.update({'updated projection' :self.updated_projection})
self.deshift_results.update({'is clipped to extent' :self.cliptoextent})
self.deshift_results.update({'arr_shifted' :self.arr_shifted})
class L2A_object(L1C_object):
......
......@@ -12,6 +12,7 @@ import numpy as np
from scipy.interpolate import interp1d
from misc import helper_functions as HLP_F
from gms_io import Input_reader as INP_R
from algorithms.L2A_P import L2A_object
usecase = builtins.GMS_config.usecase
......@@ -24,22 +25,29 @@ class L2B_object(L2A_object):
self.logger = HLP_F.setup_logger('log__' + self.baseN, self.path_logfile, append=1) if L2A_obj else None
def spectral_homogenization(self, subset=None, kind='linear'):
src_cwls = self.meta['wavelength']
tgt_cwls = usecase.target_CWL # FIXME exclude or include thermal bands; respect sorted CWLs in context of LayerBandsAssignment
assert kind in ['linear',], "%s is not a supported kind of homogenization." %kind
self.log_for_fullArr_or_firstTile(self.logger,'Performing spectral homogenization (%s) with target wavelength '
'positions at %s nm.' %(kind,', '.join(np.array(tgt_cwls[:-1]).astype(str))+' and %s' %tgt_cwls[-1]))
if isinstance(self.arr,np.ndarray):
if kind=='linear': self.arr = self.interpolate_cube_linear(tgt_cwls)
self.meta['wavelength'] = list(tgt_cwls)
self.meta['bands'] = len(tgt_cwls)
self.meta['LayerBandsAssignment'] = [] # TODO
del self.meta['band names'] # TODO
else:
raise NotImplementedError('got %s' %type(self.arr))
def interpolate_cube_linear(self, target_CWLs):
# type: (list,list) -> np.ndarray
orig_CWLs, target_CWLs = np.array(self.meta['wavelength']), np.array(target_CWLs)
outarr = interp1d(np.array(orig_CWLs), self.arr, axis=2, kind='linear', fill_value='extrapolate')(target_CWLs)
inArray = self.arr if isinstance(self.arr,np.ndarray) else \
INP_R.read_ENVIfile(self.arr, self.arr_shape, self.arr_pos, self.logger, q=1)
self.arr = self.interpolate_cube_linear(inArray,src_cwls,tgt_cwls) if kind=='linear' else self.arr