...
All drone data will initially be uploaded by external users to Q4647, held by QRIScloud. For details on uploading data to Q4647 using GridFTP, see the Globus Connect Personal Grid FTP Guide for New User.
The first stage is to make a copy of the data in our mounted NFS storage, Q4646. There are two subdirectories in Q4646: Q4646/archive and Q4646/public. We first use Rclone to keep a copy of everything in Q4646/archive. For Rclone configuration, see /wiki/spaces/TDSAG/pages/2628878424.
...
Each LAS file will be extracted from its ZIP archive before being converted into a COPC file, which can be visualized in QGIS. The COPC filename matches the ZIP filename (with a .copc.laz extension), e.g., 20220517_SASMDD0009_lic.copc.laz. A RAM-optimized VM needs to be created during the DAG run to convert LAS into COPC. For VM configuration, see here.
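As a minimal illustration of the naming convention (the helper below is hypothetical, assuming archives are named like 20220517_SASMDD0009_lic.zip):

```python
from pathlib import Path

def copc_name_for(zip_name: str) -> str:
    """Derive the COPC filename from a point-cloud ZIP filename.

    e.g. 20220517_SASMDD0009_lic.zip -> 20220517_SASMDD0009_lic.copc.laz
    """
    return Path(zip_name).stem + ".copc.laz"
```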
Workflow detail
Update Q4646/archive
In this step we simply use rclone to copy the whole directory, e.g., from Q4647/UTAS to /archive/surveillance/uas. The configuration of rclone is mentioned above; a sketch of the copy command follows.
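A minimal sketch of this copy, assuming a configured rclone remote named qriscloud for Q4647 and the NFS mount at /archive (both names are assumptions; substitute the real remote from the Rclone configuration page linked above):

```python
import subprocess

# Copy the whole collection directory from Q4647 into the mounted archive.
# "qriscloud" is an assumed rclone remote name.
subprocess.run(
    [
        "rclone", "copy",
        "qriscloud:Q4647/UTAS",       # source on the QRIScloud collection
        "/archive/surveillance/uas",  # destination on the NFS mount
        "--progress",
    ],
    check=True,
)
```

Update Q4646/public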
In this step we copy the metadata and level0_raw data into public. TODO: apply a configurable publication rule to decide which files are to be copied. The steps (sketched in code after this list) are:
1. Check whether files are level 0 or metadata.
2. Modify the path name according to our folder structure.
3. Copy the file.
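A sketch of this filtering-and-copy logic. The mount points and the classification by path components named level0_raw and metadata are assumptions; the real publication rule is still a TODO:

```python
import shutil
from pathlib import Path

ARCHIVE = Path("/archive/surveillance/uas")  # assumed mount points
PUBLIC = Path("/public/surveillance/uas")

def is_publishable(path: Path) -> bool:
    # Placeholder for the configurable publication rule (TODO above):
    # publish level-0 raw data and metadata only.
    return "level0_raw" in path.parts or "metadata" in path.parts

for src in ARCHIVE.rglob("*"):
    if src.is_file() and is_publishable(src):
        dst = PUBLIC / src.relative_to(ARCHIVE)  # keep the folder structure
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
```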
Process each LAS file
Create a VM and make it ready for the PDAL translation task; see here for more details.
In the created VM, do the following (a sketch is shown after this list):
1. Find the ZIP files in /public that contain point cloud data, and check whether the corresponding COPC file already exists.
2. Extract the LAS file from the ZIP and translate it into a COPC file using PDAL.
3. Upload the COPC file to the same directory as the ZIP file.
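A sketch of the per-file conversion, assuming PDAL is installed on the VM, the ZIPs sit under the /public mount, and each ZIP holds one LAS file (directory layout and naming pattern are assumptions):

```python
import subprocess
import tempfile
import zipfile
from pathlib import Path

PUBLIC = Path("/public/surveillance/uas")  # assumed location of the ZIPs

for zip_path in PUBLIC.rglob("*_lic.zip"):  # assumed point-cloud naming pattern
    copc_path = zip_path.with_name(zip_path.stem + ".copc.laz")
    if copc_path.exists():
        continue  # already converted
    with tempfile.TemporaryDirectory() as tmp:
        with zipfile.ZipFile(zip_path) as zf:
            # Assume one LAS file per ZIP, matching the naming convention above.
            member = next(m for m in zf.namelist() if m.lower().endswith(".las"))
            las_file = zf.extract(member, tmp)
        # PDAL infers the COPC writer from the .copc.laz extension.
        subprocess.run(["pdal", "translate", las_file, str(copc_path)], check=True)
```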
Update object store
In this step we upload the metadata and level1_prod data into the object store. TODO: apply a configurable publication rule to decide which files are to be copied. The steps (sketched in code after this list) are:
1. Check whether files are level 1 or metadata.
2. Modify the path name according to our folder structure.
3. Upload the file.
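A sketch of the upload, assuming an S3-compatible object store reachable via boto3; the endpoint, bucket name, source mount, and classification rule are all assumptions:

```python
import boto3
from pathlib import Path

ARCHIVE = Path("/archive/surveillance/uas")  # assumed source mount
BUCKET = "uas"                               # assumed bucket name
s3 = boto3.client("s3", endpoint_url="https://objectstore.example.org")  # assumed endpoint

def is_publishable(path: Path) -> bool:
    # Placeholder for the configurable publication rule (TODO above).
    return "level1_prod" in path.parts or "metadata" in path.parts

for src in ARCHIVE.rglob("*"):
    if src.is_file() and is_publishable(src):
        key = str(src.relative_to(ARCHIVE))  # keep the folder structure as the key
        s3.upload_file(str(src), BUCKET, key)
```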
Visualize COG files
Create a store and layer in GeoServer for each new COG file
1. Fetch the list of all stores within the given workspace, and the list of all .tif files in the object store.
2. Create a store for each .tif file that is not yet in GeoServer, using an HTTP request to POST the XML config for the store.
3. Create a layer for each store created in the previous step, using an HTTP request to POST the XML config for the layer (see the sketch below).
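A sketch of the store/layer creation against the GeoServer REST API, assuming the workspace is uas and each COG is reachable at an HTTP URL. The credentials are placeholders, and the exact store type/URL scheme may differ depending on the GeoServer COG plugin configuration:

```python
import requests

GEOSERVER = "https://geoserver-test.tern.org.au/geoserver"  # from the WMS URL below
AUTH = ("admin", "geoserver")  # placeholder credentials
WORKSPACE = "uas"
HEADERS = {"Content-Type": "text/xml"}

def create_store_and_layer(name: str, tif_url: str) -> None:
    # 1. Create the coverage store pointing at the COG.
    store_xml = (
        f"<coverageStore><name>{name}</name><type>GeoTIFF</type>"
        f"<enabled>true</enabled><workspace>{WORKSPACE}</workspace>"
        f"<url>{tif_url}</url></coverageStore>"
    )
    r = requests.post(
        f"{GEOSERVER}/rest/workspaces/{WORKSPACE}/coveragestores",
        data=store_xml, headers=HEADERS, auth=AUTH,
    )
    r.raise_for_status()

    # 2. Publish a layer (coverage) from the new store.
    layer_xml = f"<coverage><name>{name}</name><nativeName>{name}</nativeName></coverage>"
    r = requests.post(
        f"{GEOSERVER}/rest/workspaces/{WORKSPACE}/coveragestores/{name}/coverages",
        data=layer_xml, headers=HEADERS, auth=AUTH,
    )
    r.raise_for_status()
```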
Send the WMS URL for the workspace to publish the record in TDDP
This step is manual and one-off for each dataset. The URL is something like https://geoserver-test.tern.org.au/geoserver/uas/wms?service=WMS&version=1.3.0&request=GetCapabilities
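Before handing the URL over, it can be worth a quick sanity check that the GetCapabilities document resolves (a hypothetical check, not part of the pipeline itself):

```python
import requests

wms_url = (
    "https://geoserver-test.tern.org.au/geoserver/uas/wms"
    "?service=WMS&version=1.3.0&request=GetCapabilities"
)
resp = requests.get(wms_url, timeout=30)
resp.raise_for_status()
# A valid response is a WMS 1.3.0 capabilities XML document.
assert "WMS_Capabilities" in resp.text
```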