How EUROCONTROL Uses Lightly to Improve Contrail Detection from Satellite Imagery

Ana-Maria Pelin
ML Trainee
Overview

Lightly helped EUROCONTROL improve contrail segmentation accuracy by enabling domain-specific pretraining and distillation workflows on ash-RGB satellite imagery.

Industry
Aviation
Location
Brussels, Belgium
Employees
>1,000

Get Started with Lightly

Talk to Lightly’s computer vision team about your use case.
Book a Demo
Products
LightlyTrain
Results
Improved performance of YOLO-based models
Use Case
Satellite contrail detection & segmentation

About

EUROCONTROL’s Aviation Sustainability Unit (ASU) develops AI systems aimed at monitoring and mitigating non-CO2 emissions from European air traffic. Contrails, thin ice-crystal clouds formed behind aircraft, account for an estimated 30% of aviation’s non-CO2 climate impact. Improving the ability to detect and segment contrails in satellite observations may support future route-optimization strategies.

The ASU team, composed of three data scientists, primarily works with GOES and MTG ash-RGB imagery, a modality that differs substantially from natural-image RGB and presents domain-shift challenges for standard vision models.

Problem

EUROCONTROL evaluated several off-the-shelf architectures, including YOLO, RF-DETR, and SAM, to detect and segment contrails. Early experiments showed that performance varied significantly across models, with some struggling to generalize to ash-RGB data.

Contributing factors included:

• Limited utility of standard pretrained weights due to modality mismatch
• A relatively small dataset (~20,000 images) with strong class imbalance (many samples contain no contrails)
• Sparse segmentation label distribution and a high proportion of negative samples
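One common way to cope with an imbalance like this is inverse-frequency sample weighting, so that a sampler sees the rare contrail-positive images about as often as the abundant negatives. A minimal sketch (illustrative only, not EUROCONTROL's actual pipeline; masks here are flat lists of 0/1 pixel labels):

```python
def sample_weights(masks):
    """Inverse-frequency weights for images with/without any positive pixel.

    masks: list of flat binary masks (lists of 0/1 ints), one per image.
    Returns one weight per image; positives are upweighted when rare.
    """
    has_positive = [int(sum(m) > 0) for m in masks]  # 1 if any contrail pixel
    n_pos = sum(has_positive)
    n_neg = len(masks) - n_pos
    # Each class gets total weight len(masks)/2, split among its members,
    # so a weighted sampler draws positives and negatives roughly equally.
    w_pos = len(masks) / (2 * max(n_pos, 1))
    w_neg = len(masks) / (2 * max(n_neg, 1))
    return [w_pos if h else w_neg for h in has_positive]
```

Such weights can feed directly into a weighted random sampler during training, countering the high proportion of negative samples.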

These constraints motivated the team to explore alternative strategies for representation learning, pretraining, and model distillation in order to improve scalability and establish a reproducible engineering workflow.

Testimonials

“The pretrained models were low in performance. The color scheme is probably the reason, they just don’t transfer well to ash-RGB. This is why we decided to give LightlyTrain distillation a try.”

Ana-Maria Pelin

ML Trainee

Solution

The ASU team tested a range of techniques to improve segmentation robustness, including:

1. Domain-specific pretraining on satellite imagery.
2. Use of unlabeled data to compensate for limited annotated samples.
3. Model distillation pipelines with LightlyTrain.

LightlyTrain was used to implement reproducible distillation workflows compatible with models such as YOLO and RF-DETR. According to the ASU team, the framework helped unify augmentation and training configurations and simplified debugging efforts across architectures.
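A distillation pretraining run of this kind can be configured in a few lines. The sketch below is hypothetical: the paths and the model string are illustrative, and the exact options supported depend on your LightlyTrain version, so consult the LightlyTrain documentation before adapting it.

```python
# Hypothetical configuration for a LightlyTrain distillation run on
# unlabeled ash-RGB imagery; all paths are illustrative.
config = {
    "out": "runs/contrail_distill",    # output directory for checkpoints/logs
    "data": "data/ash_rgb_unlabeled",  # folder of unlabeled ash-RGB images
    "model": "ultralytics/yolov8s",    # student model to pretrain
    "method": "distillation",          # distill from a pretrained teacher
}

def run_pretraining(cfg):
    # Requires the lightly-train package; shown for illustration, not executed here.
    import lightly_train
    lightly_train.train(**cfg)
```

Because the same configuration shape works across student architectures, swapping YOLO for another supported model is largely a one-line change, which matches the team's point about unified augmentation and training configurations.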

They also highlighted that the documentation and implementation were straightforward. At the same time, the team continued evaluating other pretraining and segmentation strategies to determine the relative impact of different components in the pipeline.

Results

Early experiments demonstrated improvements across segmentation models:

• Distillation improved performance for YOLO-based models, with increases in true positives and gains in metrics such as mAP and Dice.
• Gains were observed even when pretraining and fine-tuning used the same dataset, though the magnitude of improvement varied.
• The ability to mix labeled and unlabeled data helped reduce earlier bottlenecks, but the degree to which this contributed relative to architectural changes is still being quantified.
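For reference, the Dice score cited above measures overlap between predicted and ground-truth masks. A minimal implementation for flat binary masks (the generic formula, not the team's evaluation code) looks like:

```python
def dice(pred, target, eps=1e-7):
    """Dice coefficient for flat binary masks (lists of 0/1 ints).

    Dice = 2*|pred & target| / (|pred| + |target|); 1.0 is a perfect match.
    eps keeps the all-empty case well defined (returns 1.0).
    """
    intersection = sum(p * t for p, t in zip(pred, target))
    return (2.0 * intersection + eps) / (sum(pred) + sum(target) + eps)
```

Dice is well suited to sparse targets like contrails because, unlike plain pixel accuracy, it is unaffected by the large number of easy negative pixels.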

The ASU team is currently preparing more formal benchmarks comparing off-the-shelf models, their custom pretraining strategies, and the distillation workflows developed using LightlyTrain.

As Ana summarized, “We saw an increase in true positives. Once the numbers are ready, we expect the improvements achieved through distillation via LightlyTrain to be significant,” though the team notes that the final evaluation will depend on comprehensive comparisons.

Outlook

The project is ongoing. Future work includes:

• Releasing benchmark results across multiple architectures and training strategies
• Assessing the generalization of distilled models on new ash-RGB data sources
• Quantifying the relative benefits of unlabeled-data usage, domain-specific pretraining, and distillation frameworks

While early findings indicate promising improvements, Eurocontrol emphasizes that the conclusions will rely on the forthcoming quantitative benchmarks rather than anecdotal trends.
