Time series analysis is one of the most common operations in remote sensing. It helps in understanding and modeling seasonal patterns as well as in monitoring land cover changes. Earth Engine is uniquely suited to extracting dense time series over long periods of time.
In this post, I will go through different methods and approaches for time series extraction. While there are plenty of examples available that show how to extract a time series for a single location – there are unique challenges that come up when you need a time series for many locations spanning a large area. I will explain those challenges and present code samples to solve them.
The ultimate goal for this exercise is to extract NDVI time series from Sentinel-2 data over 1 year for 100 farm locations spanning an entire state in India.
Prefer a video-based guide?
Check out this series of videos that gives you step-by-step instructions and explains the process of extracting an NDVI time series in Earth Engine using MODIS data.
Preparing the data
Farm Locations
To make the analysis relevant, we need a dataset of farm locations spanning a large area. If you have your own data, you can upload the shapefile to Earth Engine and use it. For the purpose of this post, I generated 100 random points, with the constraint that they fall on farmland growing a certain crop. We can use the GFSAD1000: Cropland Extent 1km Crop Dominance, Global Food-Support Analysis Data from the Earth Engine Data Catalog to select all pixels that are farmland growing wheat and rice. The stratifiedSample() method then allows us to generate sample points within those pixels – approximating a dataset of 100 farm locations.
var gaul = ee.FeatureCollection("FAO/GAUL/2015/level1");
var gfsad = ee.Image("USGS/GFSAD1000_V0");

// Select the 'landcover' band with pixel value 1
// which represents Rice and Wheat Rainfed crops
var wheatrice = gfsad.select('landcover').eq(1)

// Uttar Pradesh is a large state in the Indo-Gangetic Plain with
// a large agricultural output.
// We use the Global Administrative Unit Layers (GAUL) dataset to get the state boundary
var uttarpradesh = gaul.filter(ee.Filter.eq('ADM1_NAME', 'Uttar Pradesh'))

// The wheatrice image contains 1 and 0 pixels. We want to generate points
// only in the pixels that are 1 (representing crop areas).
// selfMask() masks the pixels with 0 value.
var points = wheatrice.selfMask().stratifiedSample({
  numPoints: 100,
  region: uttarpradesh,
  geometries: true
})

// We need a unique id for each point. We take the feature id and set it as
// a property so we can refer to each point easily
var points = points.map(function(feature) {
  return ee.Feature(feature.geometry(), {'id': feature.id()})
})

// Show the state polygon with a blue outline
var outline = ee.Image().byte().paint({
  featureCollection: uttarpradesh,
  color: 1,
  width: 3
});
Map.addLayer(outline, {palette: ['blue']}, 'AOI')

// Show the farm locations in green
Map.addLayer(points, {color: 'green'}, 'Farm Locations')

Sentinel-2 Image Collection
We will use atmospherically corrected Sentinel-2 Surface Reflectance data. To use it in our analysis, we filter the collection to images that overlap the farm locations and fall within the time range. It is also important to apply cloud masking to remove cloudy pixels from the analysis. This part is fairly straightforward: we map functions over the collection to mask clouds and add an NDVI band, then filter it down to the date range and location.
// Function to mask cloud, cloud shadow, cirrus and snow pixels
function maskCloudAndShadows(image) {
  var cloudProb = image.select('MSK_CLDPRB');
  var snowProb = image.select('MSK_SNWPRB');
  var cloud = cloudProb.lt(5);
  var snow = snowProb.lt(5);
  var scl = image.select('SCL');
  var shadow = scl.eq(3); // 3 = cloud shadow
  var cirrus = scl.eq(10); // 10 = cirrus
  // Keep pixels with low cloud and snow probability
  // that are not classified as cloud shadow or cirrus
  var mask = (cloud.and(snow)).and(cirrus.neq(1)).and(shadow.neq(1));
  return image.updateMask(mask);
}

// Add an NDVI band
function addNDVI(image) {
  var ndvi = image.normalizedDifference(['B8', 'B4']).rename('ndvi')
  return image.addBands([ndvi])
}

var startDate = '2019-01-01'
var endDate = '2019-12-31'

// Use Sentinel-2 L2A data - which has better cloud masking
var collection = ee.ImageCollection('COPERNICUS/S2_SR')
  .filterDate(startDate, endDate)
  .map(maskCloudAndShadows)
  .map(addNDVI)
  .filter(ee.Filter.bounds(points))

// View the median composite
var vizParams = {bands: ['B4', 'B3', 'B2'], min: 0, max: 2000}
Map.addLayer(collection.median(), vizParams, 'collection')

Get Time Series for a Single Location
At this point, our collection has images spanning a full year. If we want to extract NDVI values at a single location for the full year, it is quite easy: we can use the built-in charting functions to chart the NDVI value over time.
Our collection has 100 points. We call .first() to get the first point from the collection and create a chart using the ui.Chart.image.series() function. Once you print a chart, you can click the button next to it for an option to download the data as a CSV.
var testPoint = ee.Feature(points.first())
//Map.centerObject(testPoint, 10)
var chart = ui.Chart.image.series({
  imageCollection: collection.select('ndvi'),
  region: testPoint.geometry()
}).setOptions({
  interpolateNulls: true,
  lineWidth: 1,
  pointSize: 3,
  title: 'NDVI over Time at a Single Location',
  vAxis: {title: 'NDVI'},
  hAxis: {title: 'Date', format: 'YYYY-MMM', gridlines: {count: 12}}
})
print(chart)

This is a nice NDVI time-series chart showing the dual-cropping practice common in India.
Exporting Time Series for a Single Location/Region
If you want a time series over a polygon, the technique above still works. But if the region is large and your time series is long, you may still run into ‘Computation timed out’ errors. In that case, we can export the results as a CSV. We use the reduceRegion() function to get the NDVI value from an image. Since we want to do that for all images in the collection, we need to map() a function over the collection.
var filteredCollection = collection.select('ndvi')
  .filter(ee.Filter.bounds(testPoint.geometry()))

var timeSeries = ee.FeatureCollection(filteredCollection.map(function(image) {
  var stats = image.reduceRegion({
    reducer: ee.Reducer.mean(),
    geometry: testPoint.geometry(),
    scale: 10,
    maxPixels: 1e10
  })
  // reduceRegion doesn't return any output if the image doesn't intersect
  // with the point or if the image is masked out due to cloud.
  // If there was no ndvi value found, we set the ndvi to the NoData value -9999
  var ndvi = ee.List([stats.get('ndvi'), -9999])
    .reduce(ee.Reducer.firstNonNull())
  // Create a feature with null geometry and the NDVI value and date as properties
  var f = ee.Feature(null, {'ndvi': ndvi,
    'date': ee.Date(image.get('system:time_start')).format('YYYY-MM-dd')})
  return f
}))

// Check the results
print(timeSeries.first())

// Export to CSV
Export.table.toDrive({
  collection: timeSeries,
  description: 'Single_Location_NDVI_time_series',
  folder: 'earthengine',
  fileNamePrefix: 'ndvi_time_series_single',
  fileFormat: 'CSV'
})
Getting Time Series for Multiple Locations
While you can chart or export a time series for a single location as shown above, things get a bit more complex when you want to do the same for many locations. Continuing the charting method above, you may think of using the ui.Chart.image.seriesByRegion() function to get a chart for all 100 points over the year. But you will start hitting the limits of what can be done in Earth Engine’s ‘interactive’ mode.
var chart = ui.Chart.image.seriesByRegion({
  imageCollection: collection.select('ndvi'),
  regions: points,
  reducer: ee.Reducer.mean()
})
// This doesn't work as the result is too large to print
print(chart)

This is understandable. Earth Engine limits the execution time in the interactive mode to 5 minutes and times out if your computation takes longer. In such cases, the recommendation is to switch to the ‘batch’ mode, which has a lot more resources and can run a computation for a long time. The way to use the batch mode is through any of the Export functions.
The method to export a time series is explained well in this tutorial. The code there has a clever way of organizing the results of reduceRegions() into a table that can be exported. That approach works when your points do not span a large area; if you try it on this example, you will run into two problems.
Problem 1: Handling Masked Pixels
As we have masked cloudy pixels in the source images, those pixels will return null values, resulting in data gaps. And since our area spans multiple images, for any given point the majority of images will not intersect it and will also return null values. We can fix this by assigning a NoData value (such as -9999) to missing values in the time series. Specifically, we use the ee.Reducer.firstNonNull() function to programmatically substitute -9999 for any null output. Below is the modified code, which computes the mean NDVI for every point in every image, with -9999 standing in for missing values.
var triplets = collection.map(function(image) {
  return image.select('ndvi').reduceRegions({
    collection: points,
    reducer: ee.Reducer.mean().setOutputs(['ndvi']),
    scale: 10,
  })
  // reduceRegions doesn't return any output if the image doesn't intersect
  // with the point or if the image is masked out due to cloud.
  // If there was no ndvi value found, we set the ndvi to the NoData value -9999
  .map(function(feature) {
    var ndvi = ee.List([feature.get('ndvi'), -9999])
      .reduce(ee.Reducer.firstNonNull())
    return feature.set({'ndvi': ndvi, 'imageID': image.id()})
  })
}).flatten();
The triplets variable contains a tall table with 1 row per date per farm. This table is suitable for further processing in a GIS or for statistical analysis. If you require such an output, we can go ahead, set a ‘date’ property, and export this table.
// The result is a 'tall' table. We can further process it to
// extract the date from the imageID property.
var tripletsWithDate = triplets.map(function(f) {
  var imageID = f.get('imageID');
  var date = ee.String(imageID).slice(0, 8);
  return f.set('date', date)
})
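The slice(0, 8) works because Sentinel-2 asset IDs begin with the acquisition timestamp. The ID below is a made-up example following the Sentinel-2 naming pattern; in plain JavaScript the same slice looks like this:

```javascript
// A hypothetical Sentinel-2 style image ID: sensing time, processing time, MGRS tile.
var imageID = '20190312T050701_20190312T051934_T44RKR';
// The first 8 characters are the YYYYMMDD acquisition date.
var date = imageID.slice(0, 8);
console.log(date); // '20190312'
```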
// For a cleaner table, we can also filter out
// null values, remove duplicates and sort the table
// before exporting.
var tripletsFiltered = tripletsWithDate
  .filter(ee.Filter.neq('ndvi', -9999))
  .distinct(['id', 'date'])
  .sort('id');

// We can export this tall table.
// Specify the columns that we want to export
Export.table.toDrive({
  collection: tripletsFiltered,
  description: 'Multiple_Locations_NDVI_time_series_Tall',
  folder: 'earthengine',
  fileNamePrefix: 'ndvi_time_series_multiple_tall',
  fileFormat: 'CSV',
  selectors: ['id', 'date', 'ndvi']
})
Some applications will require a wide table with 1 row per farm containing all observations. We can write a format() function that turns the triplets into such a wide table.
var format = function(table, rowId, colId) {
  var rows = table.distinct(rowId);
  var joined = ee.Join.saveAll('matches').apply({
    primary: rows,
    secondary: table,
    condition: ee.Filter.equals({
      leftField: rowId,
      rightField: rowId
    })
  });
  return joined.map(function(row) {
    var values = ee.List(row.get('matches'))
      .map(function(feature) {
        feature = ee.Feature(feature);
        return [feature.get(colId), feature.get('ndvi')];
      });
    return row.select([rowId]).set(ee.Dictionary(values.flatten()));
  });
};
var sentinelResults = format(triplets, 'id', 'imageID');
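Under the hood, format() performs a pivot from the tall table to a wide one: one row per id, one column per imageID. The reshaping is easier to follow in a plain-JavaScript sketch (illustrative only – not Earth Engine code, and the records are made-up):

```javascript
// Pivot 'tall' {id, imageID, ndvi} records into 'wide' rows:
// one object per id, with each imageID becoming a column.
function toWide(records) {
  var rows = {};
  records.forEach(function(r) {
    if (!rows[r.id]) rows[r.id] = {id: r.id};
    rows[r.id][r.imageID] = r.ndvi;
  });
  return Object.keys(rows).map(function(k) { return rows[k]; });
}

var tall = [
  {id: '0', imageID: '20190312T050701', ndvi: 0.61},
  {id: '0', imageID: '20190317T050659', ndvi: -9999},
  {id: '1', imageID: '20190312T050701', ndvi: 0.48}
];
var wide = toWide(tall);
// Each row now has the form {id: '0', '20190312T050701': 0.61, '20190317T050659': -9999}
```

The EE version achieves the same grouping with a saveAll join, since a FeatureCollection cannot be mutated in place the way a plain object can.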
Problem 2: Granule Overlaps
The second problem is specific to Sentinel-2 data and how individual images are produced from the raw data. If you are working with any other dataset (Landsat, MODIS etc.), you can skip this step and export the collection generated in the previous step.
Sentinel-2 data is distributed as granules, also called tiles, which are 100×100 km² ortho-images. As you can see in the map below, neighboring granules overlap, so the same raw pixel can be present in up to 4 tiles. And since each granule is processed independently, the output pixel values can differ slightly.

If we exported the table generated in the previous step, we would see multiple NDVI values for the same day, which may or may not be identical. For our time series to be consistent, we need to harmonize these overlapping pixels. When exporting the tall table, we used the distinct() function, which simply picked the first of the duplicate values. A better solution is to take all NDVI values for the same day (generated from the same raw pixels) and assign the maximum of those values to that day. This results in a clean output with 1 NDVI value per point per day.
The following code finds all observations from the same day and creates a single output per day with the maximum of all NDVI values.
// There are multiple image granules for the same date processed from the same orbit
// Granules overlap with each other and since they are processed independently
// the pixel values can differ slightly. So the same pixel can have different NDVI
// values for the same date from overlapping granules.
// So to simplify the output, we can merge observations for each day
// And take the max ndvi value from overlapping observations
var merge = function(table, rowId) {
  return table.map(function(feature) {
    var id = feature.get(rowId)
    var allKeys = feature.toDictionary().keys().remove(rowId)
    var substrKeys = ee.List(allKeys.map(function(val) {
      return ee.String(val).slice(0, 8)
    }))
    var uniqueKeys = substrKeys.distinct()
    var pairs = uniqueKeys.map(function(key) {
      var matches = feature.toDictionary()
        .select(allKeys.filter(ee.Filter.stringContains('item', key)))
        .values()
      var val = matches.reduce(ee.Reducer.max())
      return [key, val]
    })
    return feature.select([rowId]).set(ee.Dictionary(pairs.flatten()))
  })
}
var sentinelMerged = merge(sentinelResults, 'id');
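Stripped of the EE plumbing, merge() groups the wide table's column names by their 8-character date prefix and keeps the maximum value within each group. The same grouping in plain JavaScript (illustrative only – not Earth Engine code, with made-up column names):

```javascript
// Collapse columns sharing a YYYYMMDD prefix into one column holding the max value.
function mergeByDate(row) {
  var out = {id: row.id};
  Object.keys(row).forEach(function(key) {
    if (key === 'id') return;
    var date = key.slice(0, 8); // group key: the YYYYMMDD prefix
    var val = row[key];
    if (!(date in out) || val > out[date]) {
      out[date] = val;
    }
  });
  return out;
}

// Two overlapping granules from the same day produce two columns;
// keep the higher of the two NDVI values.
var row = {id: '0', '20190312T050701_T43RGM': 0.58, '20190312T050701_T43RFM': 0.61};
var merged = mergeByDate(row);
// merged is {id: '0', '20190312': 0.61}
```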
Exporting the Time Series
The collection now contains the formatted output and can be exported as a CSV file. Running the code below will create an Export task. Click Run, confirm the parameters, and start the task. Once the export task finishes, the CSV file will be in your Google Drive.
Export.table.toDrive({
  collection: sentinelMerged,
  description: 'Multiple_Locations_NDVI_time_series_Wide',
  folder: 'earthengine',
  fileNamePrefix: 'ndvi_time_series_multiple_wide',
  fileFormat: 'CSV'
})
You can see the full script at https://code.earthengine.google.co.in/24c0b6b1a8004a6cd7f43924dcd5cb05
Here is the resulting CSV file.

Hope you found the post useful and got some inspiration to apply it to your own problem. Do leave a comment if you have ideas to improve the code.
If you are new to Earth Engine and want to master it, check out my course End-to-End Google Earth Engine.
Hi, thanks a lot for your work, it helps me a lot in writing my bachelor thesis.
I need help with the date format:
When I export NDVI for multiple points, the date format doesn’t suit me. Is there any way to format them? Instead of 20190101, I would need 2019-01-01 (yyyy-mm-dd).
I’ve tried to figure it out myself, but I’m a beginner and can’t figure it out.
Thanks again, have a great day!
https://code.earthengine.google.com/?scriptPath=users%2Fabdoelhmdi%2FNDVI_CODE%3ANDVI_CODE
Could you tell me please, what is the problem in the script above? The final result is null – why?
“Cannot load script. Script “NDVI_CODE” does not exist in repository “users/abdoelhmdi/NDVI_CODE”.
You need to set up script sharing or check the url is correct.
Thank you very much for your excellent coding. Sir, how can we implement it for a specific study area? What should the ee.FeatureCollection and ee.Image data be? Please can you suggest, sir?
You can upload a shapefile of your study area and use it in the script. See more about upload process here https://courses.spatialthoughts.com/end-to-end-gee.html#importing-data
Thank you so much Ujaval, absolutely great post!
I wanted to ask you how I can use the average and not the maximum of the “doubled” daily values?
If I change
var val = matches.reduce(ee.Reducer.max())
to
var val = matches.reduce(ee.Reducer.mean())
the value is calculated using -9999 (at least I think, because I get negative values).
my code: https://code.earthengine.google.com/29a00cd6f090b258d73a3a99e39ff72d
Thanks a lot again, hats off to you for your work.
Thank you. You can try the following to remove the -9999 before calculating the mean
var val = matches.removeAll(-9999).reduce(ee.Reducer.mean())
Let me know if that worked.
Thanks for answer. Now I tried it and I get the error: Invalid argument specified for ee.List(): -9999.
I tried replacing the original code:
var ndvi = ee.List([feature.get('ndvi'), -9999]) – I replaced the -9999 value with null, but then it seems to miss some days in the export.
Any ideas to solve the problem? Thanks a lot!
any ideas, please? I’m desperate :))
My bad. The removeAll() function takes a list, so the correct code is below
var val = matches.removeAll([-9999]).reduce(ee.Reducer.mean())
I love you, Ujaval! It works as you write. Thank you so much, you helped me a lot!
This is a great tutorial and thank you very much for the thorough step by step guide. I am getting an error while exporting the data and am unable to interpret it. I will be grateful if you please help.
Error: Error in map(ID=2020_01_01_00000000000000000113): Number.format: Parameter ‘number’ is required.
I had the same problem. Try having a look at the name of the ID you are referring to in this line:
var timeseries = format(triplets, ‘farm_id’, ‘ImageId’)
My mistake was to write “imageId” instead of “ImageId”.
Thanks for sharing this tutorial it has been super helpful! Rather than generate points within GEE I uploaded my points of interest from a csv file. In my csv file I have a point id column. Is it possible in line 18 to specify the value GEE recognizes as the id to the column in my csv file rather than the feature id? In other words instead of :
{‘id’: feature.id()}
Can I do this?
{‘id’: “point id from my csv file”}
Hopefully this question makes sense and I appreciate any advice folks can offer.
Very great work Ujaval ji, I am one of your fans. I just want to know: if I have more than 100 farms in shapefile or KML format, how can I extract them to CSV? Second question: if I have the lat/long of 100 farms in Excel form and I want to extract NDVI values for all of them, how can I do that?
Convert your KML to Shapefile and upload it to Earth Engine. See the upload workflow at https://courses.spatialthoughts.com/end-to-end-gee.html#importing-data
Similarly, create a shapefile from the Excel file in QGIS and upload to GEE. (Drag and drop Excel file in QGIS and use ‘Create Point Layer from Table’ tool from Processing Toolbox. Save it as a shapefile)
Time saving! I appreciate your service to the field of research.
What if I want to extract multiple indices (NDVI, NDRE, EVI) and plot them as well, each with its own curve?
See this thread for code sample for extracting multiple indices https://groups.google.com/g/google-earth-engine-developers/c/VK3kpXpAkTk/m/BRwddKk9AgAJ?utm_medium=email&utm_source=footer&pli=1
Thanks dear Ujaval, I am one of your fans and followers on YouTube and LinkedIn. I want to ask you: how can we extract a large training sample (e.g. 45,000 samples) using a MODIS NDVI time series from 2008 to 2019?
Thanks, Said.
You should be able to extract the large sample using the method described here. You should export the result and not print it in the Console (as it will time-out). Upload your data as a shapefile and try it out.
Thank you so much dear. How can we apply this for time series classification?
Time Series Classification is a completely different problem. I cover this in detail in my GEE course under ‘Supervised Classification and Change Detection’ https://courses.spatialthoughts.com/end-to-end-gee.html#introduction-to-change-detection
Hi, I have a dataset as a FeatureCollection and I want to use that instead of numPoints: 100. But those points may or may not fall in cultivated areas, and I want to select only those points that fall in cultivated areas and then download them as a CSV. I am using the CDL cropland layer for the year 2019 to get cultivated areas, but I am not sure how to do the above. Can you please help me? Thanks
One way would be to sample the image at all points and then filter all points where the extracted value was null.
Example https://code.earthengine.google.co.in/1b572e8fa65bda1d23996cefc3218f30
If I have polygons instead of points, can I get the daily mean of NDVI in my time series?
You can use polygons without any changes to the code. It will compute the mean NDVI within each polygon.
Hi Ujaval,
Thank you so much for your detailed explanation. I’m trying to run your code for a vast number of polygons (9,000) for a year. It’s been 3 hours and it’s still loading. Is there any way to make it faster?
Hi Ujaval,
When I try to use your code for a vast number of polygons (around 9,000), the run never stops.
code: https://code.earthengine.google.com/659f8888b8769f309766091229a28e2c
Could you please have a look at it?
Large jobs can take time (can take hours or days). Wait for the job to finish or fail.
To debug, try exporting small sample (5 polygons). Use .limit(5) on the collection to select 5 polygons and export. If that succeeds, then your code is fine and will finish eventually. There is nothing to be done to speed it up. (It will still be orders of magnitude faster than trying to do this on a desktop).
If the job fails, share the error and I will take a look.
Thank you. Once I’ve the result I’ll post it again
Something different: can I make the CSV filled with the -9999 value for every day of the year (e.g. Jan 1, Jan 2, Jan 3…), and fill it with actual data where I have data?
Yes. Create an empty dictionary with day as the key and -9999 as value. Then merge it with the properties of the feature.
Here’s the code showing how (I have limited it to 2 points so you can print and verify the results)
https://code.earthengine.google.com/1c7c12ecccdf5306466b826cabe40a27
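The approach in the linked snippet can be sketched in plain JavaScript: build a dictionary with every day of the year set to -9999, then overwrite the days that have observations. This is only the idea, not Earth Engine code; the EE version would build an ee.Dictionary of default values and merge the observed values into it. The observation keys below are hypothetical:

```javascript
// Fill every day of a year with the NoData marker, then overlay observed values.
function fillDaily(year, observations) {
  var filled = {};
  var d = new Date(Date.UTC(year, 0, 1));
  while (d.getUTCFullYear() === year) {
    var key = d.toISOString().slice(0, 10).replace(/-/g, ''); // YYYYMMDD
    filled[key] = -9999;
    d.setUTCDate(d.getUTCDate() + 1);
  }
  Object.keys(observations).forEach(function(k) {
    filled[k] = observations[k];
  });
  return filled;
}

var filled = fillDaily(2019, {'20190312': 0.61});
// filled has 365 keys: '20190101' is -9999, '20190312' is 0.61
```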
Hi ujaval,
I read your post and wondering whether I could extract the albedo time series from MODIS data within an area defined by a shapefile.
Kind regards,
Peter
The process should be very similar to the one described in the video in this post where I show how to extract MODIS NDVI time-series.
Thank you, your code is very useful for me.
My chart works fine with modisNDVI but shows the following error with scaledNDVI:
“Error generating chart: Projection error: Unable to compute intersection of geometries in projections and .”
Below is link of my code:-
https://code.earthengine.google.com/?scriptPath=users%2Fsuruca%2Fsuresh_bishnoi%3ANDVI_timeseries
This is not the correct link for sharing code. You also need to make any uploaded asset public. Read this section to know the process. https://courses.spatialthoughts.com/end-to-end-gee.html#sharing-a-single-script
Once you share your script link, I will take a look.
I have made respective asset public and below is the link copied using specified process.
https://code.earthengine.google.com/?scriptPath=users%2Fsuruca%2Fsuresh_bishnoi%3ANDVI_timeseries
This is still not correct. Please review the instructions in the link I sent.
In the above code, after using the flatten() method I got 24 features, i.e. 4 NDVI values for each of the 6 geographies –
4 feature collections with 14 columns (attributes of the 6 geographies), but I am not able to see the NDVI of these 6 geographies. Further, the NDVI value (one of these 14 columns) is also not between -1 and 1. It seems it is a total over all 6 geographies.
When I click on it, it opens here, and I followed exactly the same process.
Is it possible that it opens at my end but you are not able to open it?
Your link will only open for you. To share the code with others, you have to use ‘Get Link’ and make your assets public. I am requesting again to read the link carefully and follow the process. This will avoid wasting everyone’s time. https://courses.spatialthoughts.com/end-to-end-gee.html#sharing-a-single-script
https://code.earthengine.google.co.in/8a8df4c1335ee5ed35a60056ebbb96a6
I am extremely sorry for the inconvenience. Above is the correct link.
Below is the link to the updated script:
https://code.earthengine.google.co.in/e9d98dbde85751432915248206b881b4
My above problem was solved just by replacing the end date ‘2022-02-28’ with ’22-01-31’ in ee.Filter.date, and by incorporating ‘system:index’ in the statement below.
var chart = ui.Chart.image.series(scaledNDVI, testFarm, ee.Reducer.mean(), 1000, 'system:index')
Otherwise, with only the first change above, I was getting the error below:
“Error generating chart: No features contain non-null values of “system:time_start”.”
Though I could not understand the detailed reasoning behind either of the above changes.
Hi Ujaval, Thank you for your work.
I have a shapefile containing three types of trees (Pinus, Eucalyptus, Araucaria). These 3 types are mixed and non recognizable from each other. I need to map only the Eucalyptus trees. I know that as an artificial forest, eucalyptus has a unique growth cycle: the wood is harvested and trees are replanted for another growth period every several years. I thought by calculating NDVI over the years and finding areas that NDVI lowers greatly every couple of years, I can distinguish Eucalyptus from other two trees. I wrote this script to calculate NDVI from Sentinel2 and add it as a band to image collection:
https://code.earthengine.google.com/f4ffa4241e494e37e313659b8684476e
I do not know how to draw chart of NDVI all over this area in order to find the points that NDVI lowers every few years.
I know an NDVI time series can easily be charted over a point, but can it be done over an area in order to distinguish pixels? Thanks
Use ui.Chart.image.seriesByRegion(). Here’s an example doing an LST time series over multiple regions
https://code.earthengine.google.com/d10358d9651f2b339b39b6d4e3fd5332
Hello Ujaval,
Thank you for providing the tutorial, it really assisted me in understanding how GEE works. I have adapted your code to calculate a kernel version of NDVI called ‘KNDVI’, but unfortunately I ran into a couple of errors, such as the ones below:
Line 118: kndvi is not defined
collection: Tile error: User memory limit exceeded.
Line 118: Computation timed out
and “null”.
Can you kindly assist me with solving these errors thank you and the link to my code it below
https://code.earthengine.google.com/2aa9cf6bd4d5865067e26830845b0219
There are several typos where ndvi needs to be changed to kndvi. (since your band name is kndvi)
https://code.earthengine.google.com/03943c0bea4b5d24c1e9e4540f7f1c7c
Thank you for highlighting the typos. I have managed to resolve the errors, but the code won’t load the time-series chart for a single location the way it once did with the ‘NDVI’ code from the tutorial, before I adapted the code to ‘kndvi’.
I have also tried making a batch version of the code, but I constantly receive the errors.
Can you kindly assist me?
https://code.earthengine.google.com/4f6bfe8a7d31e3cda2b7000c0d3c2428
Hello Ujaval Gandhi,
Thanks a lot for all the useful information in your tutorials! I have already learned to execute some operations in GEE, but now I have a task which I do not know how to do it:
I have some 1400 points of animal observations from Africa to Europe starting from year 1980 to 2021. In its CSV file it has the fields (among others) Lat., Long., Year, Month and Day of each observation. I am trying to obtain the specific NDVI values for a certain point according its place and date. For example, of a point with date 2009-05-13 I would like to obtain only the closest NDVI value for that point. Or for example in 2012 I have 16 points, of which one is of the 30th of June. For this point I do not want to obtain all the NDVI values of that year but only of June.
Do you know if there is a way to extract NDVI values for each year and according to the months of the points of that year (as in one year I have different observations points in different months)? And do you know how the script must look like? I hope you can help me solving this task.
Kind regards,
Henk
This can be done using Joins. See this example
https://code.earthengine.google.co.in/5bcd63b5e1b45bb462a328bea7217f12
You can learn more about Joins in EE at https://developers.google.com/earth-engine/guides/joins_intro
Hello Ujaval,
Thank you very much for your fast response and the example script! It worked perfectly with my own FeatureCollection and the ImageCollection of the example. Let’s see if I can apply it on other ImageCollections as well. Thanks again!
Kind regards,
Henk
Hello Ujaval,
Thanks to your videos and suggestions I have almost reached my goal. As I wrote before, I am trying to obtain (and export) only the closest NDVI value for a certain point and its date; say for a point with date 2003_07_31, obtain the NDVI value of that point from the NDVI band with date 2003_07_28 (as it is the closest).
I have tried many things and my problem is that I have achieved two cases:
1) I have a Timeseries in which each point has 23 NDVI values of the whole year (like as “format.triplets” in your video Part5 (Earth Engine Guided Project)).
2) Each point is linked with closest NDVI image according its date by the property the “system:time_start” . That is exactly what is inside the script you have sent before.
Like in Case 2, do you know a way how to extract the specific NDVI value out of the band to which every point has been linked with?
Kind regards,
Henk
In Case2), you have to map() a function on the featurecollection that takes a feature and extracts the NDVI from the images stored in the property. Something like below
var ndviTimeSeries = joinResult.map(function(f) {
  var images = ee.ImageCollection(f.get('images'))
  var geometry = f.geometry()
  // Now you have an image collection and a geometry
  // Apply the code to do reduceRegion() at the geometry for each image
  // return f with additional properties for NDVI values
})
Hello Ujaval,
Thanks for the tip! I understand your explanation. However, I have tried several times to run it, but I suppose I do not get it formulated in the right way, as I am getting errors all the time. Could you please take a look at my script and set this last part of the script and its parameters as they must be? Then you can also see the previously created variables and their properties. Probably when you see this last part you will find the obstacle very quickly, and it may be just a few simple changes to set it correctly, but due to my lack of experience I do not get it.
https://code.earthengine.google.com/?scriptPath=users%2Fhenk90tol%2FNDVI%3AMOD13Q1_006Terra_18July
I have given you permission for the used asset.
Sorry for disturbing.
Kind regards,
Henk
Hi Ujaval,
Great post! Extremely helpful, since there were several steps (i.e. the granule overlap problem) I had not identified. I would only add that I have found cloud/shadow masking using the cloud-displacement algorithm to be more reliable than any other I have used recently.
// Aggressively mask clouds and shadows.
function maskImage(image) {
  // Compute the cloud displacement index from the L1C bands.
  var cdi = ee.Algorithms.Sentinel2.CDI(image);
  var s2c = image.select('probability');
  var cirrus = image.select('B10').multiply(0.0001);
  // Assume low-to-mid atmospheric clouds to be pixels where probability
  // is greater than 65%, and CDI is less than -0.5. For higher atmosphere
  // cirrus clouds, assume the cirrus band is greater than 0.01.
  // The final cloud mask is one or both of these conditions.
  var isCloud = s2c.gt(65).and(cdi.lt(-0.5)).or(cirrus.gt(0.01));
  // Reproject is required to perform spatial operations at 20m scale.
  // 20m scale is for speed, and assumes clouds don't require 10m precision.
  isCloud = isCloud.focal_min(3).focal_max(16);
  isCloud = isCloud.reproject({crs: cdi.projection(), scale: 20});
  // Project shadows from clouds we found in the last step. This assumes we're
  // working in a UTM projection.
  var shadowAzimuth = ee.Number(90)
    .subtract(ee.Number(image.get('MEAN_SOLAR_AZIMUTH_ANGLE')));
  // With the following reproject, the shadows are projected 5km.
  isCloud = isCloud.directionalDistanceTransform(shadowAzimuth, 50);
  isCloud = isCloud.reproject({crs: cdi.projection(), scale: 100});
  isCloud = isCloud.select('distance').mask();
  return image.select('B2', 'B3', 'B4').updateMask(isCloud.not());
}
Hi Bruce – Thanks for sharing the CDI algorithm. For anyone interested, here’s the full implementation for reference. https://code.earthengine.google.com/d6f6410125719923b2249bf0cc42ca34 and blog post https://medium.com/google-earth/more-accurate-and-flexible-cloud-masking-for-sentinel-2-images-766897a9ba5f
Hi! Great Job!
How can I select another point, instead of first(), to plot the daily data chart?
Use filter() to filter for the point and then use .first()
Hi Ujaval,
how do i get an image for an average NDVI shp of a time series and export it?
Dear Ujaval
I am trying to extract data from a time-series image collection, but it returns null values for all the crop types used for the extraction, which should not be the case.
Here is the code I am working with. Can you kindly assist?
https://code.earthengine.google.com/696430378ca835373459fd0864decf9f
Your asset is not public. Make it readable by anyone and then I can take a look. Learn how to share assets correctly: https://www.youtube.com/watch?v=UYx7_RwY5CQ