# Material for "The forest in a function ..." conference contribution (BiDS'25)

This repository provides the code, data, and materials to reproduce the experiments and results presented in our paper:

- **Title:** The forest in a function: democratizing deep learning for flexible and scalable EO analysis
- **Authors:** Loïc Dutrieux, Keith Araño, Pieter Kempeneers
- **Conference:** Big Data from Space (BiDS'25)

This work introduces and utilizes the `xinfereo` Python package, available at https://code.europa.eu/jrc-forest/xinfereo.

## Repository Contents Overview

- `/src`: Python source code for the deep learning model (architecture, dataset handlers, losses, transforms) used for the Tree Cover Density (TCD) experiments.
- `/notebooks`: Jupyter notebooks for the core experimental workflows:
  - `01_training.ipynb`: Model training.
  - `02_testing_modalities.ipynb`: Evaluation of model performance with different input data scenarios.
  - `03_scalability_test.ipynb`: Code for the scalability assessment.
- `/scripts`: Utility Python scripts for stages such as AOI sampling, data extraction, normalization parameter generation, ONNX model export, and prediction generation (a minimal inference sketch follows this list).
- `/data`: Example data, including the areas of interest (`aois.fgb`), the model normalization parameters (`normalization_parameters.json`), and a sample output TCD map (`output_tcd_2024_31TFK.tiff`); see the loading sketch after this list.
- `/latex`: The LaTeX source and figures for the BiDS'25 manuscript.
- `/condor`: HTCondor submission scripts for batch data extraction tasks.
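
For orientation, here is a minimal sketch of how an exported ONNX model can be run with ONNX Runtime to generate predictions. The file name, band count, and patch size below are placeholders, not values taken from this repository; the actual export and prediction workflow lives in the `/scripts` directory.

```python
import numpy as np
import onnxruntime as ort

# Load the exported model (placeholder file name; see /scripts for the actual export).
session = ort.InferenceSession("tcd_model.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

# Dummy input: batch of 1, with a hypothetical band count and patch size.
patch = np.random.rand(1, 12, 256, 256).astype(np.float32)

# Run inference; assumes the model exposes a single output tensor (predicted TCD).
prediction = session.run(None, {input_name: patch})[0]
print(prediction.shape)
```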
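
The example data in `/data` can be inspected with standard geospatial Python tooling. This is a sketch, assuming the repository root as the working directory and that `geopandas` and `rasterio` are installed; it only loads and prints the files listed above.

```python
import json

import geopandas as gpd
import rasterio

# Areas of interest (FlatGeobuf) used for sampling.
aois = gpd.read_file("data/aois.fgb")
print(aois.head())

# Normalization parameters applied to the model inputs.
with open("data/normalization_parameters.json") as f:
    normalization = json.load(f)
print(normalization)

# Sample output Tree Cover Density map (tile 31TFK, 2024).
with rasterio.open("data/output_tcd_2024_31TFK.tiff") as src:
    tcd = src.read(1)
    print(src.crs, src.res, tcd.shape)
```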

## Execution Environment

All significant computational workflows (large-scale data extraction, model training, and scalability tests) were run on the Joint Research Centre's (JRC) Big Data Analytics Platform (BDAP): https://jeodpp.jrc.ec.europa.eu/bdap/.