

List of Layers

uas_surveillance

Workflow description

[workflow diagram]

All drone data is initially uploaded by external users to Q4647, held by QRIScloud. The first stage is to make a copy of the data into our mounted NFS storage Q4646, which has two subdirectories: Q4646/archive and Q4646/public. We first use Rclone to keep a copy of everything in Q4646/archive. For the Rclone configuration, see /wiki/spaces/TDSAG/pages/2628878424.

The next step is to decide what should be published and what should be kept in the archive only; the rule should be configurable. The data consists of level0_raw, which is the raw data, and level1_prod, which is the data further processed by the external user and ready for use. level0_raw (along with metadata) goes to Q4646/public, and level1_prod goes to the Object Store on Nectar. A sketch of such a rule follows.
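For illustration, such a rule could be kept in a small lookup keyed on the level marker in the path; the names below (PUBLICATION_RULES, destination_for) are hypothetical, not the actual configuration:

    # A minimal sketch of a configurable publication rule.
    # Markers and targets are assumptions, not the real config.
    PUBLICATION_RULES = {
        "level0": "Q4646/public",   # level0_raw (plus metadata) is published on NFS
        "level1": "object_store",   # level1_prod goes to the Nectar Object Store
    }

    def destination_for(path: str) -> str | None:
        """Return the publication target for a file, or None for archive-only."""
        for marker, target in PUBLICATION_RULES.items():
            if f"/{marker}" in path:  # matches level0_raw, level1_prod/level1_proc, etc.
                return target
        return None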

Currently the folder structure is ...uas/<data provider / project>/<sensor type>/<site>/<plot>/<date>/..., e.g., /uas/surveillance/imagery/Calperum/SASMDD0008/20220519/imagery/multispec/level1_proc/20220519_SASMDD0008_rededgemx_ortho_05_cog.tif
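For illustration, a path in this structure could be decomposed as follows; parse_uas_path is our name for a hypothetical helper, not an existing function:

    from pathlib import PurePosixPath

    def parse_uas_path(path: str) -> dict:
        """Split a path under .../uas/ into its structural components."""
        parts = PurePosixPath(path).parts
        i = parts.index("uas")
        provider, sensor, site, plot, date = parts[i + 1:i + 6]
        return {"provider": provider, "sensor": sensor, "site": site,
                "plot": plot, "date": date, "rest": "/".join(parts[i + 6:])}

    # For the example above: provider="surveillance", sensor="imagery",
    # site="Calperum", plot="SASMDD0008", date="20220519".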

COG data visualization

Each COG file will be visualized in GeoServer as a single layer. Currently the name of the store/layer in GeoServer is <site>_<filename>, e.g., Calperum_20220519_SASMDD0008_rededgemx_ortho_05_cog.tif.
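As a sketch, the name can be derived from the site and the file name (layer_name is an illustrative helper, not existing code):

    from pathlib import PurePosixPath

    def layer_name(site: str, tif_path: str) -> str:
        """Build the GeoServer store/layer name as <site>_<filename>."""
        return f"{site}_{PurePosixPath(tif_path).name}"

    # layer_name("Calperum", ".../20220519_SASMDD0008_rededgemx_ortho_05_cog.tif")
    # -> "Calperum_20220519_SASMDD0008_rededgemx_ortho_05_cog.tif"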

LAS data visualization

Each LAS file will be extracted from its zip before being converted into a COPC file, which can be visualized in QGIS. The filename is the same as the zip file's, e.g., 20220517_SASMDD0009_lic.copc.laz. A RAM-optimized VM needs to be created during the DAG run to convert LAS into COPC. For the VM configuration, see here.
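The conversion itself can be done with the pdal translate command; a minimal sketch, assuming pdal is on the PATH of that VM:

    import subprocess

    def las_to_copc(las_path: str, copc_path: str) -> None:
        """Convert a LAS file to COPC with the PDAL CLI."""
        # PDAL picks the COPC writer from the .copc.laz output extension.
        subprocess.run(["pdal", "translate", las_path, copc_path], check=True)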

Workflow detail

  1. Update Q4646/archive
    In this step we simply use Rclone to copy the whole directory, e.g., from Q4647/UTAS to /archive/surveillance/uas. The Rclone configuration is mentioned above; a sketch of the call follows.
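    A sketch of this step, wrapping the Rclone CLI; the remote name "qriscloud" is an assumption, the real one comes from the configuration linked above:

      import subprocess

      def update_archive() -> None:
          """Mirror the QRIScloud collection into the NFS archive with rclone."""
          subprocess.run(
              ["rclone", "copy", "qriscloud:Q4647/UTAS",
               "/archive/surveillance/uas", "--progress"],
              check=True,
          )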

  2. Update Q4646/public
    In this step we copy the metadata and level0_raw data into Q4646/public. A sketch of sub-steps 2–4 appears after this list.

    1. TODO: apply a configurable publication rule to decide which files are to be copied.

    2. Check whether the files are level 0 data or metadata.

    3. Modify the path name according to our folder structure.

    4. Copy the file.
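    A minimal sketch of sub-steps 2–4, reusing the publication-rule idea from above; is_publishable, publish, and the mount paths are assumptions:

      import shutil
      from pathlib import Path

      ARCHIVE_ROOT = Path("/archive/surveillance/uas")   # assumed mount points
      PUBLIC_ROOT = Path("/public/surveillance/uas")

      def is_publishable(path: Path) -> bool:
          """Keep only level-0 data and metadata for Q4646/public."""
          s = str(path)
          return "level0" in s or "metadata" in s.lower()

      def publish(src: Path) -> None:
          """Copy one publishable file from the archive into /public."""
          if not is_publishable(src):
              return
          dst = PUBLIC_ROOT / src.relative_to(ARCHIVE_ROOT)  # rewrite the path
          dst.parent.mkdir(parents=True, exist_ok=True)
          shutil.copy2(src, dst)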

  3. Process each LAS file

    1. Create a VM and make it ready for the pdal-translate task.

    2. In the created VM, do the following (a combined sketch appears after these sub-steps):

      1. Find the zip files in /public that contain point cloud data, and check whether the corresponding COPC file already exists.

      2. Extract the LAS file from the zip and translate it into a COPC file using PDAL.

      3. Upload the COPC file to the same directory as the zip file.
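    Put together, the work inside the VM could look like this sketch; the zip layout, the /tmp scratch directory, and convert_point_clouds are assumptions, and las_to_copc is the PDAL wrapper sketched earlier:

      import zipfile
      from pathlib import Path

      def convert_point_clouds(public_root: str = "/public/surveillance/uas") -> None:
          """Find point-cloud zips, skip already-converted ones, produce COPC files."""
          for zip_path in Path(public_root).rglob("*.zip"):
              copc_path = zip_path.with_name(zip_path.stem + ".copc.laz")
              if copc_path.exists():
                  continue  # sub-step 1: already converted on a previous run
              with zipfile.ZipFile(zip_path) as zf:
                  las_names = [n for n in zf.namelist() if n.lower().endswith(".las")]
                  if not las_names:
                      continue  # zip does not contain point cloud data
                  las_file = zf.extract(las_names[0], path="/tmp")  # sub-step 2
              # Sub-steps 2-3: translate, writing the COPC next to the zip
              # (this assumes the VM mounts /public directly).
              las_to_copc(las_file, str(copc_path))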

  4. Update object store
    In this step we upload the metadata and level1_prod data to the object store. A sketch of the upload appears after this list.

    1. TODO: apply a configurable publication rule to decide which files are to be copied.

    2. Check whether the files are level 1 data or metadata.

    3. Modify the path name according to our folder structure.

    4. Upload the file.
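    A sketch of the upload, assuming the Nectar object store is addressed through the Swift API via python-swiftclient; the container name "uas" and the pre-authenticated connection are assumptions:

      import swiftclient

      def upload_level1(conn: swiftclient.client.Connection,
                        local_path: str, object_name: str) -> None:
          """Upload one level-1 (or metadata) file into the object store container."""
          with open(local_path, "rb") as f:
              conn.put_object("uas", object_name, contents=f)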

  5. Visualize COG files

    1. Create a store/layer in GeoServer for each new COG file (a sketch appears after these sub-steps)

      1. Fetch the list of all stores within the given workspace and all .tif files in the object store.

      2. Create a store for each .tif file not yet in GeoServer, using an HTTP request to POST the XML config for the store.

      3. Create a layer for each store created in the previous step, again using an HTTP request to POST the XML config for the layer.
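    A sketch of sub-steps 2–3 against the GeoServer REST API; the workspace name, credentials, and XML fields are assumptions based on GeoServer's GeoTIFF coverage-store format:

      import requests

      GEOSERVER = "https://geoserver-test.tern.org.au/geoserver/rest"
      AUTH = ("admin", "<password>")   # placeholder credentials
      WORKSPACE = "uas"
      HEADERS = {"Content-Type": "text/xml"}

      def create_store_and_layer(name: str, cog_url: str) -> None:
          """POST the store XML, then the layer (coverage) XML, for one COG file."""
          store_xml = (f"<coverageStore><name>{name}</name><type>GeoTIFF</type>"
                       f"<enabled>true</enabled><workspace>{WORKSPACE}</workspace>"
                       f"<url>{cog_url}</url></coverageStore>")
          requests.post(f"{GEOSERVER}/workspaces/{WORKSPACE}/coveragestores",
                        data=store_xml, headers=HEADERS, auth=AUTH).raise_for_status()

          layer_xml = f"<coverage><name>{name}</name><nativeName>{name}</nativeName></coverage>"
          requests.post(f"{GEOSERVER}/workspaces/{WORKSPACE}/coveragestores/{name}/coverages",
                        data=layer_xml, headers=HEADERS, auth=AUTH).raise_for_status()

      # Sub-step 1: existing stores can be listed with a GET on
      # {GEOSERVER}/workspaces/{WORKSPACE}/coveragestores (Accept: application/json).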

    2. Send the WMS URL for the workspace to publish the record in TDDP
      This step is manual and one-off for each dataset. The URL is something like https://geoserver-test.tern.org.au/geoserver/uas/wms?service=WMS&version=1.3.0&request=GetCapabilities
