Extracting Time Series using Google Earth Engine

Time series analysis is one of the most common operations in Remote Sensing. It helps us understand and model seasonal patterns, as well as monitor land cover changes. Earth Engine is uniquely suited to extracting dense time series over long periods of time.

In this post, I will go through different methods and approaches for time series extraction. While there are plenty of examples that show how to extract a time series for a single location, unique challenges come up when you need a time series for many locations spanning a large area. I will explain those challenges and present code samples to solve them.

The ultimate goal for this exercise is to extract NDVI time series from Sentinel-2 data over 1 year for 100 farm locations spanning an entire state in India.

Preparing the data

Farm Locations

To make the analysis relevant, we need a dataset of farm locations spanning a large area. If you have your own data, you can upload the shapefile to Earth Engine and use it. For the purpose of this post, I generated 100 random points. I wanted those random points to meet certain criteria: they should fall on farmland growing a particular crop. We can use the GFSAD1000: Cropland Extent 1km Crop Dominance, Global Food-Support Analysis Data from the Earth Engine Data Catalog to select all pixels which are farmland growing wheat and rice. The stratifiedSample() method then allows us to generate sample points within those pixels, approximating a dataset of 100 farm locations.

var gaul = ee.FeatureCollection("FAO/GAUL/2015/level1");
var gfsad = ee.Image("USGS/GFSAD1000_V0");
// Select 'landcover' band with pixel values 1 
// which represent Rice and Wheat Rainfed crops
var wheatrice = gfsad.select('landcover').eq(1)
// Uttar Pradesh is a large state in Indo-Gangetic Plain with
// a large agricultural output.
// We use the Global Administrative Unit Layers (GAUL) dataset to get the state boundary
var uttarpradesh = gaul.filter(ee.Filter.eq('ADM1_NAME', 'Uttar Pradesh'))
// wheatrice image contains 1 and 0 pixels. We want to generate points
// only in the pixels that are 1 (representing crop areas)
// selfMask() masks the pixels with 0 value.
var points = wheatrice.selfMask().stratifiedSample(
    {numPoints: 100, region: uttarpradesh, geometries: true})
// We need a unique id for each point. We take the feature id and set it as
// a property so we can refer to each point easily
var points = points.map(function(feature) {
  return ee.Feature(feature.geometry(), {'id': feature.id()})
})
// Show the state polygon with a blue outline
var outline = ee.Image().byte().paint({
  featureCollection: uttarpradesh,
  color: 1,
  width: 3
});
Map.addLayer(outline, {palette: ['blue']}, 'AOI')
// Show the farm locations in green
Map.addLayer(points, {color: 'green'}, 'Farm Locations')

Sentinel-2 Image Collection

We will use atmospherically corrected Sentinel-2 Surface Reflectance data. To use it in our analysis, we need to filter the collection to images overlapping the farm locations and within the time range. It is also important to apply cloud masking to remove cloudy pixels from the analysis. This part is fairly straightforward: we map functions over the collection to mask clouds and add an NDVI band, and then filter it down to a date range and location.

// Function to remove cloud and snow pixels
function maskCloudAndShadows(image) {
  var cloudProb = image.select('MSK_CLDPRB');
  var snowProb = image.select('MSK_SNWPRB');
  var cloud = cloudProb.lt(5);
  var snow = snowProb.lt(5);
  var scl = image.select('SCL'); 
  var shadow = scl.eq(3); // 3 = cloud shadow
  var cirrus = scl.eq(10); // 10 = cirrus
  // Keep pixels with low cloud and snow probability
  // that are not classified as cloud shadow or cirrus
  var mask = (cloud.and(snow)).and(cirrus.neq(1)).and(shadow.neq(1));
  return image.updateMask(mask);
}
// Adding a NDVI band
function addNDVI(image) {
  var ndvi = image.normalizedDifference(['B8', 'B4']).rename('ndvi')
  return image.addBands([ndvi])
}
var startDate = '2019-01-01'
var endDate = '2019-12-31'
// Use Sentinel-2 L2A data - which has better cloud masking
var collection = ee.ImageCollection('COPERNICUS/S2_SR')
    .filterDate(startDate, endDate)
    .map(maskCloudAndShadows)
    .map(addNDVI)
    .filter(ee.Filter.bounds(points))
// View the median composite
var vizParams = {bands: ['B4', 'B3', 'B2'], min: 0, max: 2000}
Map.addLayer(collection.median(), vizParams, 'collection')
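As a quick sanity check on what normalizedDifference(['B8', 'B4']) computes, here is the NDVI formula in plain JavaScript (not Earth Engine code), using hypothetical band values in the 0–10000 scaled-reflectance range that Sentinel-2 SR uses:

```javascript
// NDVI = (NIR - Red) / (NIR + Red)
// On Sentinel-2, B8 is the near-infrared band and B4 is the red band.
function ndvi(nir, red) {
  return (nir - red) / (nir + red);
}

console.log(ndvi(6000, 2000)); // → 0.5 (vegetated pixel)
console.log(ndvi(2000, 2000)); // → 0 (no difference between bands)
```

Values range from -1 to +1; dense green vegetation typically falls above 0.5, while bare soil sits much lower.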

Get Time Series for a Single Location

At this point, our collection has images spanning a full year. If we wanted to extract NDVI values at any location for the full year, it is quite easy. We can use the built-in charting functions to chart the NDVI value over time.

Our collection has 100 points. We call .first() to get the first point from the collection and create a chart using the ui.Chart.image.series() function. Once you print a chart, you can click the button next to it to get an option to download the data as a CSV.

var testPoint = ee.Feature(points.first())
//Map.centerObject(testPoint, 10)
var chart = ui.Chart.image.series({
    imageCollection: collection.select('ndvi'),
    region: testPoint.geometry()
    }).setOptions({
      interpolateNulls: true,
      lineWidth: 1,
      pointSize: 3,
      title: 'NDVI over Time at a Single Location',
      vAxis: {title: 'NDVI'},
      hAxis: {title: 'Date', format: 'YYYY-MMM', gridlines: {count: 12}}
    })
print(chart)

This is a nice NDVI time-series chart showing the dual-cropping practice common in India.

Getting Time Series for Multiple Locations

Here’s where things start getting complicated. Continuing the charting method above, you may think of using the ui.Chart.image.seriesByRegion() function to get a chart for all 100 points over the year. But you will start hitting the limits of what can be done in Earth Engine’s ‘interactive’ mode.

var chart = ui.Chart.image.seriesByRegion({
    imageCollection: collection.select('ndvi'),
    regions: points,
    reducer: ee.Reducer.mean()
})
// This doesn't work as the result is too large to print
print(chart)

This is understandable. Earth Engine limits the execution time in the interactive mode to 5 minutes and times out if your computation takes longer. In such cases, the recommendation is to switch to the ‘batch’ mode, which has a lot more resources and can run computations for a long time. The way to use the batch mode is through one of the Export functions.

The method to export a time series is explained well in this tutorial. The code has a clever way of organizing the results of reduceRegions() into a table that can be exported. That code works when your points do not span a large area; if you try using it for this example, you will run into problems.

Problem 1: Handling Masked Pixels

As we have masked cloudy pixels in the source images, those pixels will return null values, resulting in data gaps. Moreover, since our area spans multiple images, any given point will intersect only a few of the images; the rest will return a null value. We can fix this by assigning a NoData value (such as -9999) to missing values in the time series. Specifically, we use the ee.Reducer.firstNonNull() reducer to programmatically assign -9999 to any output containing a null value. Below is the modified code, which generates a table with each point id as a row and the NDVI values from each image as columns.

var triplets = collection.map(function(image) {
  return image.select('ndvi').reduceRegions({
    collection: points, 
    reducer: ee.Reducer.first().setOutputs(['ndvi']), 
    scale: 10,
  })// reduceRegion doesn't return any output if the image doesn't intersect
    // with the point or if the image is masked out due to cloud
    // If there was no ndvi value found, we set the ndvi to a NoData value -9999
    .map(function(feature) {
      var ndvi = ee.List([feature.get('ndvi'), -9999])
        .reduce(ee.Reducer.firstNonNull())
      return feature.set({'ndvi': ndvi, 'imageID': image.id()})
    })
}).flatten();

var format = function(table, rowId, colId) {
  var rows = table.distinct(rowId); 
  var joined = ee.Join.saveAll('matches').apply({
    primary: rows, 
    secondary: table, 
    condition: ee.Filter.equals({
      leftField: rowId, 
      rightField: rowId
    })
  });
  return joined.map(function(row) {
      var values = ee.List(row.get('matches'))
        .map(function(feature) {
          feature = ee.Feature(feature);
          return [feature.get(colId), feature.get('ndvi')];
        });
      return row.select([rowId]).set(ee.Dictionary(values.flatten()));
  });
};

var sentinelResults = format(triplets, 'id', 'imageID');
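The join-and-pivot performed by format() can be hard to follow in Earth Engine's deferred style. Here is a plain-JavaScript sketch of the same idea (not Earth Engine code, with made-up point and image ids): each (id, imageID, ndvi) triplet becomes one cell of a table that has a row per point and a column per image.

```javascript
// Pivot a flat list of triplets into one object (row) per point id,
// with the image ids as column names - the same shape format() produces.
function formatTriplets(triplets) {
  var rows = {};
  triplets.forEach(function(t) {
    if (!rows[t.id]) rows[t.id] = {id: t.id};
    rows[t.id][t.imageID] = t.ndvi;
  });
  return Object.keys(rows).map(function(k) { return rows[k]; });
}

// Made-up triplets: 2 points observed by 2 images
var triplets = [
  {id: '0', imageID: '20190113T052209_20190113T052531_T44RKR', ndvi: 0.31},
  {id: '0', imageID: '20190128T052129_20190128T052201_T44RKR', ndvi: -9999},
  {id: '1', imageID: '20190113T052209_20190113T052531_T44RKR', ndvi: 0.42}
];
console.log(formatTriplets(triplets));
```

Each resulting row can then be written out as one CSV line, which is what makes the table exportable.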

Problem 2: Granule Overlaps

The second problem is specific to Sentinel-2 data and how individual images are produced from the raw data. The Sentinel-2 data is distributed as granules, also called tiles, which are 100×100 km² ortho-images. As you can see in the map below, there is an overlap between neighboring granules, so the same raw pixel can be present in up to 4 tiles. And since each granule is processed independently, the output pixel values can be slightly different.

If we exported the table generated in the previous step, we would see multiple NDVI values for the same day, which may or may not be identical. For our time series to be consistent, we need to harmonize these overlapping pixels. A decent solution is to take all NDVI values for the same day (generated from the same raw pixels) and assign the maximum of all values to that day. This results in a clean output with one NDVI value per point per day.

The following code finds all images of the same day and creates a single output for the day with the maximum of all NDVI values.

// There are multiple image granules for the same date processed from the same orbit
// Granules overlap with each other and since they are processed independently
// the pixel values can differ slightly. So the same pixel can have different NDVI 
// values for the same date from overlapping granules.
// So to simplify the output, we can merge observations for each day
// And take the max ndvi value from overlapping observations
var merge = function(table, rowId) {
  return table.map(function(feature) {
    var id = feature.get(rowId)
    var allKeys = feature.toDictionary().keys().remove(rowId)
    // Each column name starts with the acquisition date (YYYYMMDD)
    var substrKeys = ee.List(allKeys.map(function(val) { 
        return ee.String(val).slice(0,8)
    }))
    var uniqueKeys = substrKeys.distinct()
    var pairs = uniqueKeys.map(function(key) {
      var matches = feature.toDictionary().select(allKeys.filter(
          ee.Filter.stringContains('item', key))).values()
      var val = matches.reduce(ee.Reducer.max())
      return [key, val]
    })
    return feature.select([rowId]).set(ee.Dictionary(pairs.flatten()))
  })
}
var sentinelMerged = merge(sentinelResults, 'id');
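The key manipulation in merge() is easier to see outside Earth Engine. Below is a plain-JavaScript sketch of the same logic on a single made-up row: the column names begin with the acquisition date, so slicing the first 8 characters (YYYYMMDD) groups the columns by day, and we keep the maximum value per day.

```javascript
// Merge the columns of one row by day, keeping the max NDVI.
// Column names are image ids that start with a YYYYMMDD date.
function mergeByDay(row) {
  var merged = {id: row.id};
  Object.keys(row).forEach(function(key) {
    if (key === 'id') return;
    var day = key.slice(0, 8);  // YYYYMMDD prefix of the image id
    if (!(day in merged) || row[key] > merged[day]) {
      merged[day] = row[key];
    }
  });
  return merged;
}

// Made-up row: two overlapping granules from 2019-01-13 with
// slightly different NDVI, plus one observation from 2019-01-28
var row = {
  id: '0',
  '20190113T052209_T44RKR': 0.31,
  '20190113T052209_T44RKS': 0.33,
  '20190128T052129_T44RKR': 0.28
};
console.log(mergeByDay(row));
```

Note that taking the maximum also plays well with the -9999 NoData fill: as long as one granule has a real observation for that day, the NoData value is discarded.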

The result is a table that can be exported as a CSV file using the Export.table.toDrive() function.

Export.table.toDrive({
    collection: sentinelMerged,
    description: 'NDVI_time_series',
    folder: 'earthengine',
    fileNamePrefix: 'ndvi_time_series',
    fileFormat: 'CSV'
})

You can see the full script at https://code.earthengine.google.co.in/6c5ed0605c9a66c8f85d6ab0e932a23f

Here is the resulting CSV file.

Hope you found the post useful and got some inspiration to apply it to your own problem. Do leave a comment if you have ideas to improve the code.

If you are new to Earth Engine and want to master it, check out my course Applied Remote Sensing with Google Earth Engine.


Comments

  1. Hi Ujaval !! Thank you so much for your post. That’s gold to me 🙂

    I have a question: I tried to adapt your code to create a .csv table for each band (b2, b3, b4, …).
    When you give the solution for the overlapping granules, is it possible that it works for each band too?
    This part of your code makes me think of the NDVI threshold: // return ee.String(val).slice(0,8)} //


    • Hi Eva – ee.String(val).slice(0,8) is for extracting the date (YYYYMMDD) that is part of the name of the image. So it should work regardless of which band you are extracting.

      To get b1, b2 etc, change the ‘ndvi’ values when you are creating the triplets with ‘b1’, ‘b2’ etc.

  2. Hi Ujaval, thank you very much for this great post ! That’s what I’ve been looking for for a long time.

    I’m doing a similar time series exercise and I have two questions regarding the overlapping granules.

    – Why would you take the maximum of all pixel values to make one single output ? Why not the mean of the pixel values ? Would you say one is more appropriate than the other ?

    – Would you have a simpler code to create a single output for the day with the maximum of all pixel values from an image collection? I’m not using NDVI nor the code from problem 1 because I’m working on water reflectance.

    Thank you in advance for your response.


    • Hi Axelle,

      Glad you found this useful.

      Regarding taking the maximum value, it depends on your application. Mean/Median is fine too.

      For the daily max value – just calling max() on the collection will give you a composite with the maximum value at each pixel from all images. You will need a collection that is filtered by the day. You can use the calendarRange() function to filter the collection. Tell me more about the exact use case and maybe I can give you a code snippet.

      • Hi Ujaval,

        Thank you for your quick reply !

        My goal is to find a correlation between the water quality and the surface reflectance of Sentinel-2 imagery.
        For that, I extracted the pixel value around a point located in the middle of a river with a buffer of 15 meters and I created a time series chart for all the Sentinel-2 bands.

        Here is a part of my code :

        var roi = ee.Geometry.Point(5.068344, 50.492951).buffer(15);

        var s2Col = ee.ImageCollection('COPERNICUS/S2_SR')
        .filterDate(startDate, endDate)
        .select(['B1','B2','B3','B4','B5','B6','B7','B8','B9','B8A','B11','B12'])

        print('Surface reflectance by band', ui.Chart.image.series(s2Col, roi, ee.Reducer.mean()).setChartType('LineChart'));

        I hope that was clear enough.


    • Hi Asmae – The reason you have gaps is because you are showing only a single image on the map. When you iterate over the collection and combine bands of each image in a single image, you get a giant image where the bands are B1, B2, B3, …., B1_1, B2_1, B3_1, …. , B1_2, B2_2, B3_2 .. , when you display your image, you are asking to display only B3, B2, B1 which are from the first image. Since you have applied cloud masking there will be gaps.

      I haven’t seen this type of classification using each image from a time series. Usually you do a composite from a time-range, i.e. a growing season and create composites from that time range and use them for classification. Here’s a code snippet on how to do it.

      // Creating Sentinel-2 based seasonal data to detect phenological differences
      var seasons = ee.List.sequence(1, 12, 3).map(function(month) {
        month = ee.Number(month)

        var collection = s2
          .filterDate('2015-01-01', '2018-12-31')
          .filter(ee.Filter.calendarRange(month, month.add(2), 'month'))
          // Pre-filter to get less cloudy granules.
          .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 10))
          // Removing cloudy images
          // Adding indices

        var reducer = ee.Reducer.median()
        var composite = collection.reduce(reducer)
        return composite
      })

      var season_composites = ee.ImageCollection.fromImages(seasons).toBands()
      var composite = composite.addBands(season_composites)

  3. Thank you for your reply,

    I’ve seen your proposition, and it’s interesting, but then I realized that I had mis-explained my objective.

    I wanted to make a map to differentiate between crop types in a single year (from the beginning of the agricultural year to its end) using NDVI time series, so each crop would be characterized by its spectral-temporal profile. That’s why I made a stacked image using each NDVI band from the time series, and that stack would be the input for the classification.
    Do you think that idea stands?

    The NDVI bands were also clipped using AOI named domaine

    I have made some modifications and applied 3 classifications (RF, CART and SVM), but I still couldn’t get a good classification. Only the CART classification covers the frame of the AOI domaine; the SVM and RF results are cut in half. I realized that this comes from input bands that don’t cover the AOI. Maybe I should remove them, but I don’t know how to do so. Or is there another solution?

    The new code with rectifications

    So thankful ,,

    • Since each image doesn’t cover your AOI – you can’t use it to sample training points. To use the spatio-temporal profile of NDVI, you can use monthly or seasonal composites (not individual images) and add them as bands. I realize this is a common question and there is no good example in the user guide. So I will cover this in another blog post soon.

  4. Hi Ujaval! thank you SO much!

    This code is really helpful, it’s all I’ve been looking for haha. Really appreciate your time for explaining it.

  5. Hi Ujaval, really useful post and clearly explained, thanks a lot!
    I do have a question: I pretty new with GEE. I would like to apply this code for a shapefile containing multiple polygons.
    The ‘id’ I would like to use for my crop fields is contained in the feature collection under features properties in the field ‘Name’

    Could you help me understanding how to adapt line 21 in order to make the code work?

    var points = points.map(function(feature) {
    return ee.Feature(feature.geometry(), {‘id’: feature.id()})

    I hope my question is clear enough.
    Thank you some much again for this post, I could learn a lot!

  6. Hi ujaval, thank you very much! Your code is very helpful!
    I have a quick question for you: maskS2clouds removes cloud pixels from images, but is there a way to remove the entire image that contains those cloud pixels?
    In my application, I want to generate NDVI time series for polygons that I import into Google Earth Engine, and I want to drop each image that contains clouds. Basically, I want to keep in my time series only the NDVI values from images where there are no clouds.

    • That is quite simple. You can filter out images that have more than a certain percentage of cloud cover.

      collection.filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20))

      If you want images with zero clouds, you will end up with very few images, but give it a try

      collection.filter(ee.Filter.eq('CLOUDY_PIXEL_PERCENTAGE', 0))

    • You can use GAUL which is available in the catalog. But the international boundaries won’t be correct.

      var gaul = ee.FeatureCollection("FAO/GAUL_SIMPLIFIED_500m/2015/level2");

  7. Thanks for posting this Ujaval! Your final output is exactly what I’m hoping to generate. However, I modified the code for a Landsat EVI series, and I’m running into a couple of issues. First, in line 67 I’m getting an error that says “EVI is not defined in this scope”, even though it is a band in the image collection. When I comment out the .set command in that line the code runs and I can generate the feature collection.

    Next, when attempting to generate the table to export, I get an error that says “Dictionary: Element at position 0 is not a string.” Any idea why I’m getting these errors?

    Code is available here: https://code.earthengine.google.com/863f733d4d87702e11b3e9ce3cc140f0

    I’m a novice GEE user, so any help you can offer is much appreciated.

      • That did the trick! Thanks for taking the time to look at what was ultimately a very basic mistake on my end.

      • Hi Ujaval,
        Thanks for the fixed code — I was also looking for something like this!
        On running the code (including the commented sections) I get a “Collection query aborted after accumulating over 5000 elements” error. Could you point me towards why this happens, and whether it prevents the eventual export to CSV? How would I fix it?

  8. Hi Ujaval, many thanks. You helped me a lot. But I have another question:

    I’m trying to retrieve all bands’ values of my sample points, and these points are divided into two classes(0 and 1). So I used ui.Chart.image.regions to meet my need, but it warned:
    Error generating chart: Data column(s) for axis #0 cannot be of type string.
    Here is my link: https://code.earthengine.google.com/5218af79c0450c0f2e3f0cf41ffa8f46?accept_repo=users%2Fgorelick%2FEE102
    Do you have any suggestions on how to solve the problem?

  9. Hello Ujaval


    Thanks for this tutorial, it is really helpful.

    I am trying to run this task on MODIS NDVI product.

    Whenever I choose the feature collection given in the mentioned presentation, it works fine.

    But when I insert my own feature collection, which is a grid file, the code works fine in the console but it is unable to export to CSV.
    It is giving an error ‘Error: Error in map(ID=2018_04_07_00000000000000000023): Dictionary: Element at position 0 is not a string.’

    Here is the link to my code & asset:-



  10. Hallo Ujaval,

    I am trying to map flood events using google earth engine and I am supposed to use the expression below to mask water and make a time series for water area coverage. How can I include this in the code to get it running? The link code is also below:

    var indices = ee.Image.cat(

    image.expression("(b(4) - b(3)) / (b(4) + b(3))").rename("ndvi"),

    image.expression("(b(2) - b(4)) / (b(2) + b(4))").rename("ndwi"),

    image.expression("(b(2) - b(5)) / (b(2) + b(5))").rename("mndwi"));

    var water_mask = indices.expression('b("ndvi") < 0 && b("mndwi") > 0.5').rename("water_mask");

    Below is the link code:


    Thanks alot


  11. Hello Ujaval

    Thanks for this tutorial, it is really helpful.

    I have a question about the date.

    I modified the dates to span two years, but it doesn’t work; it still calculates the NDVI for one year only.

    I’m a novice GEE user, so any help you can offer is much appreciated.

    • Hope it helps. Try changing ‘COPERNICUS/S2_SR’ to ‘COPERNICUS/S2’. It will use Level-1C data instead of Level-2A data. If possible, you should not use too long a time range, as it may fail with a computation timed out error.

  12. I want to use specific points, so I uploaded my map coordinates (Longitude, Latitude) as a CSV file and imported it as a table. I tried to change the points variable in your code. I have only about 100 points and the dates span just 1 month, but it doesn’t work; the task never finishes. I want to know how to use my own specific points with your code. I tried to define ‘raw_points’ to use instead of ‘points’ in your code. Here is my code (some variables are not used): https://code.earthengine.google.com/cf890f86f22e5332e189b49285c72e49

  13. Hello sir,
    Sir, I am trying to develop a real-time land usage monitoring tool using satellite data and artificial intelligence.
    How can I start with it?
    I am unable to understand where to start.

  14. Hi Ujaval: Thank you so much for this detailed instruction, it has made getting my time series into R so much easier!

    I was modifying the code to use Landsat SR. Everything goes well except the final “granule overlap” part. I didn’t change anything in the code other than replacing Sentinel with Landsat. But instead of getting what you have (74 properties from different dates for each feature), I only got 2 properties for each feature, “id” and “LC08_124”, which seems like an aggregated value.

    I am wondering if you have any suggestions for applying the last part to Landsat images? Is there a change I might need to be aware of?


  15. Hi Ujaval, thanks for sharing the tips on how to get a time-series.

    In my work, I aim to get a time series for a few environmental variables that I have chosen. What I did was to get images within a year and then apply a mean, to get the annual mean value per pixel. However, after I applied reduceRegions on the image, the mean value did not appear in the output. I have tried across the chosen variables and I get the same results. Could it be due to a scaling issue, or is there something I am not aware of?

    Link: https://code.earthengine.google.com/81f6d0123ccbd135de419711341fbdfa

    PS: Sorry if the script is a bit dense, I’m pretty new to Earth Engine and I wasn’t too comfortable with mapping functions across collections

  16. Hi ujaval,

    Very nice tutorial 🙂

    I would like to know if it is possible to export the tiff images from all the timestamps

    Thank you for your help

  17. Hello Sir,

    Thank you so much for the tutorial. I have a question related to averaging each polygon in a feature collection.

    I am working on yield prediction using machine learning techniques in GEE. For that, I have multiple polygons representing the crop fields. I want to get the mean vegetation indices for each polygon in the feature collection. When I applied the mean function, I got a single mean value for all features. Secondly, when I divided the available features into training and testing data, pixel values are taken instead of the polygon.

    I would be most grateful if you could give me any suggestions.

  19. Hi Ujaval!

    Really useful article; it helped me a lot with the phenology analysis and with identifying the planting and growing seasons to make the composite. I would like to ask you about another problem I am facing now.

    I would like to get n random points within a featurecollection that contains many polygons. So if I have a featurecollection with 2 polygons and I want 20 random points, I would like to get 40 random points in total, which means 20 for the first polygon and 20 for the second one.

    I have tried ee.FeatureCollection.randomPoints, which works very well when the featurecollection only has one polygon, but when I add an extra polygon it fails and returns as many points as there are polygons, not 20 points for region1 and 20 for region2. Any suggestion about it? And/or how to map this function over each element of the featurecollection? Below is the code I have tried. Thank you so much in advance for your help!

    //This works without problem 🙂

    var region1= ee.Geometry.Rectangle(-0.289, 39.165,-0.297 ,39.170);

    // Create 20 random points in the region.
    var randomPoints = ee.FeatureCollection.randomPoints(region1,20);

    // response after run randomPoints size = 20

    // Add to map
    Map.addLayer(randomPoints, {}, 'random points');

    // add extra polygon and create featurecollection Fails,

    var region2 = ee.Geometry.Rectangle(-119.224, 34.669, -99.536, 50.064);

    var poly = ee.FeatureCollection([region1,region2])

    var NewrandomPoints = ee.FeatureCollection.randomPoints(poly,20);

    // response after run NewrandomPoints size = 2. 🙁

    Thanks a lot!

  20. Hi Ujaval.
    Great tutorial! I’m writing it in Python to see how it works using the Python API. So far, very well.

    Just a question… Is there a way to “re-arrange” the final table? My goal is to have a table with these columns: Feature_ID (since I’m working with several points), Date (the date of the image) and NDVI value.

    As in this tutorial, I need to extract NDVI values from different points in a Feature Collection and export a table that has also the date of the images from the Time Series.

    Thank you in advance!

    PD: I’ll be glad to share the Python version of the tutorial once I’m done!

    • Hi. If you want to export the table with just 3 columns, you can export the triplets collection. All the work after that is to merge those rows into a single row per feature.

      Please do share a github link to your Python implementation. Will be happy to link from this post.
