I would like to request use of the XL instance to run frequency transforms of TESS light curves, which should take about 8 hours and produce about 12 GB of data.
Details: I am working on a similarity search for light curves as a potential new MAST search feature, which requires running a frequency transform on each light curve. So far I have run successful tests on ~10,000 light curves, and I would like to scale up to all of the TESS 2-minute cadence light curves (about 1.6 million). The XL instance's 128 cores would keep the runtime manageable.
The frequency transforms (from wavelet analysis) are saved as 64x64 8-bit unsigned integer arrays, which comes to 6.55 GB of raw array data for 1.6 million light curves; the 12 GB estimate adds overhead from the .npy file format and the hierarchical directory structure used to store them.
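As a sanity check, the raw-storage arithmetic behind the estimate can be sketched as follows (the filename and array contents below are placeholders, not the actual pipeline outputs):

```python
import numpy as np

# Each frequency transform is a 64x64 array of 8-bit unsigned integers.
n_light_curves = 1_600_000
array_shape = (64, 64)

bytes_per_array = int(np.prod(array_shape)) * np.dtype(np.uint8).itemsize  # 4096 bytes
raw_total_gb = n_light_curves * bytes_per_array / 1e9
print(f"{raw_total_gb:.2f} GB")  # -> 6.55 GB of raw array data

# Each transform would be written as its own .npy file, e.g.:
transform = np.zeros(array_shape, dtype=np.uint8)
np.save("example_transform.npy", transform)
# Per-file .npy headers plus filesystem/directory overhead are what push
# the total from 6.55 GB of raw data up to the ~12 GB estimate.
```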