geetools.ImageCollectionAccessor.outliers#

geetools.ImageCollectionAccessor.outliers(bands=[], sigma=2, drop=False)#

Compute the outliers for each pixel in the specified bands.

A pixel is considered an outlier if either:

outlier = value > mean + (sigma * stddev)
outlier = value < mean - (sigma * stddev)

In a 1D example with sigma = 1:

  • values = [1, 5, 6, 4, 7, 10]

  • mean = 5.5

  • std dev ≈ 3

  • mean + (sigma * stddev) = 8.5

  • mean - (sigma * stddev) = 2.5

  • outliers = values outside [2.5, 8.5] = [1, 10]
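The 1D example above can be checked with a short Python sketch (using the population standard deviation, which rounds to the 3 quoted above):

```python
import statistics

values = [1, 5, 6, 4, 7, 10]
sigma = 1
mean = statistics.fmean(values)   # 5.5
std = statistics.pstdev(values)   # ~2.75, rounded to 3 in the example
low, high = mean - sigma * std, mean + sigma * std
outliers = [v for v in values if v < low or v > high]
print(outliers)  # [1, 10]
```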

In this function, an extra band is added to each image for each evaluated band, holding the outlier status. The band name is the original band name with the suffix “_outlier”. A value of 1 means the pixel is an outlier; 0 means it is not.

Optionally, users can discard this band by setting drop to True; outliers are then simply masked in each image. This is useful when the outlier band is not needed and the user wants to save space.
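The difference between drop=False (flag) and drop=True (mask) can be sketched on the same 1D values — a pure-Python illustration of the semantics, not the Earth Engine implementation:

```python
values = [1, 5, 6, 4, 7, 10]
low, high = 2.5, 8.5  # bounds from the 1D example above

# drop=False: keep every value and add an outlier flag
# (the per-band "_outlier" companion: 1 = outlier, 0 = not)
flagged = [(v, int(not low <= v <= high)) for v in values]

# drop=True: no extra flag; outlier values are masked instead
masked = [v if low <= v <= high else None for v in values]

print(flagged)  # [(1, 1), (5, 0), (6, 0), (4, 0), (7, 0), (10, 1)]
print(masked)   # [None, 5, 6, 4, 7, None]
```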

idea from: https://www.kdnuggets.com/2017/02/removing-outliers-standard-deviation-python.html

Parameters:
  • bands (geetools.types.ee_list) – the bands to evaluate for outliers. If empty, all bands are evaluated

  • sigma (geetools.types.ee_number) – the number of standard deviations to use to compute the outlier

  • drop (bool) – whether to drop the outlier band from the images

Returns:

an ImageCollection with the outlier band added to each image or masked if drop is True

Return type:

ee.ImageCollection

Examples

import ee, LDCGEETools

collection = (
    ee.ImageCollection("LANDSAT/LC08/C01/T1_TOA")
    .filterBounds(ee.Geometry.Point(-122.262, 37.8719))
    .filterDate("2014-01-01", "2014-12-31")
)

outliers = collection.ldc.outliers(["B1", "B2"], 2)
print(outliers.getInfo())