[kepler-dev] GridRescaler for GARP Workflow

Edward A. Lee eal at eecs.berkeley.edu
Tue Nov 16 21:40:57 PST 2004


I'm not sure how useful this will be, but in $PTII/ptolemy/math
there is a class called Interpolation.java that says:

    This class provides algorithms to do interpolation. Currently, zero,
    first, and third order interpolations are supported. These are the
    interpolation orders most often used in practice. Zero order interpolation
    holds the last reference value; first order does linear interpolation;
    and third order interpolation is based on the Hermite curves in chapter
    11 of "Computer Graphics: Principles and Practice", by Foley, van Dam,
    Feiner and Hughes, 2nd ed. in C, 1996.
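
As a quick illustration, the zero- and first-order cases amount to the
following (a generic Java sketch, not the actual Interpolation.java API):

    // Generic sketch of zero- and first-order interpolation between two
    // reference samples; not the ptolemy.math.Interpolation interface.
    public class InterpolationSketch {

        // Zero order: hold the last reference value.
        static double zeroOrder(double y0) {
            return y0;
        }

        // First order: linear interpolation between (x0, y0) and (x1, y1),
        // evaluated at x with x0 <= x <= x1.
        static double firstOrder(double x0, double y0,
                                 double x1, double y1, double x) {
            double t = (x - x0) / (x1 - x0);
            return y0 + t * (y1 - y0);
        }
    }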

Possibly more relevant is the FIR filter actor, which handles both
interpolation and decimation, and works on streaming data, so not everything
needs to be in memory. It requires doing a reasonable job of filter design,
which can be done using MATLAB, for example.

The FIR filter is in $PTII/ptolemy/domains/sdf/lib.
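
For reference, the usual FIR-based approach to rate conversion by a rational
factor L/M is to zero-stuff by L, lowpass filter, and then keep every M-th
sample. A minimal, non-polyphase sketch (the coefficients h[] would come from
a real filter design, e.g. in MATLAB; this is not the SDF FIR actor's code):

    // Sketch of FIR-based rate conversion by L/M: upsample by zero
    // insertion, convolve with a lowpass FIR filter h, downsample by M.
    static double[] resample(double[] x, double[] h, int L, int M) {
        // Upsample: place each input sample L slots apart, zeros in between.
        double[] up = new double[x.length * L];
        for (int i = 0; i < x.length; i++) {
            up[i * L] = L * x[i];      // scale by L to preserve amplitude
        }
        // Direct-form FIR filtering (truncated at the start of the signal).
        double[] filtered = new double[up.length];
        for (int n = 0; n < up.length; n++) {
            double acc = 0.0;
            for (int k = 0; k < h.length && k <= n; k++) {
                acc += h[k] * up[n - k];
            }
            filtered[n] = acc;
        }
        // Downsample: keep every M-th filtered sample.
        double[] y = new double[(filtered.length + M - 1) / M];
        for (int i = 0; i < y.length; i++) {
            y[i] = filtered[i * M];
        }
        return y;
    }

In practice a polyphase implementation avoids computing the samples that are
discarded, which is part of what makes the streaming version attractive.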

Edward

At 11:10 AM 11/16/2004 -0800, Dan Higgins wrote:
>Hi All,
>
>Some thoughts/information:
>
>One of the actors needed for the GARP workflow is a
>GridRescaling/Interpolation actor. Its basic purpose is to change the
>extent and cell size of a grid of data. In particular, this is needed for
>GARP because all the input layers must have the same extents and cell sizes.
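>
>Concretely, the rescaling is mostly coordinate bookkeeping driven by the two
>grids' extents and cell sizes: each output cell center is mapped back into
>the source grid. A small sketch of that arithmetic (illustrative names only,
>not code from any existing actor):
>
>    // For output cell (row, col), find the nearest source cell, given each
>    // grid's upper-left corner (xmin, ymax) and square cell size.
>    static int[] sourceCellFor(int outRow, int outCol,
>                               double outXmin, double outYmax, double outCell,
>                               double srcXmin, double srcYmax, double srcCell) {
>        // Map coordinates of the output cell center.
>        double x = outXmin + (outCol + 0.5) * outCell;
>        double y = outYmax - (outRow + 0.5) * outCell;
>        // Nearest source cell indices (row 0 at the top of the grid).
>        int srcCol = (int) Math.floor((x - srcXmin) / srcCell);
>        int srcRow = (int) Math.floor((srcYmax - y) / srcCell);
>        return new int[] { srcRow, srcCol };
>    }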
>
>There are several ways of doing this. First of all, there is already an
>actor that will create a grid from data (the Interpolate actor, created by
>Efrat and John Harris), which uses an IDW (inverse-distance-weighting)
>algorithm. This actor is designed for sparse, irregular data, however,
>rather than for large regular input grids (e.g. *.asc files). John also
>created an R example, which again is for sparse data (it uses an X,Y,Z
>ASCII input).
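>
>For context, IDW estimates each grid cell as a distance-weighted average of
>the scattered sample points. A minimal sketch of the idea (not the actual
>Interpolate actor code):
>
>    // Inverse-distance-weighted estimate at (x, y) from scattered points
>    // (px[i], py[i]) with values pz[i]; 'power' is commonly 2.
>    static double idw(double x, double y,
>                      double[] px, double[] py, double[] pz, double power) {
>        double num = 0.0, den = 0.0;
>        for (int i = 0; i < pz.length; i++) {
>            double dx = x - px[i];
>            double dy = y - py[i];
>            double dist = Math.sqrt(dx * dx + dy * dy);
>            if (dist == 0.0) {
>                return pz[i];        // exactly on a sample point
>            }
>            double w = 1.0 / Math.pow(dist, power);
>            num += w * pz[i];
>            den += w;
>        }
>        return num / den;
>    }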
>
>Other possibilities include the use of GRASS routines or gdal. Chad is
>getting gdal running for data input; its use for rescaling still needs to
>be investigated. Questions include whether very large grids (too large to
>fit in RAM) can be handled and what interpolation algorithms are available
>(we probably want at least nearest-neighbor and inverse-distance-weighting).
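>
>If gdal does pan out, one option would be to delegate the rescaling to its
>gdalwarp utility from a Java actor. A rough sketch of what that might look
>like (the extent and cell-size values are placeholders, and the option set
>and supported output formats would need to be checked against the gdal
>version we end up using; writing *.asc output would likely need a separate
>translation step):
>
>    import java.io.IOException;
>
>    // Sketch: shell out to gdalwarp to set the target extent (-te),
>    // target cell size (-tr), and resampling method (-r).
>    public class GdalRescaleSketch {
>        public static void main(String[] args)
>                throws IOException, InterruptedException {
>            ProcessBuilder pb = new ProcessBuilder(
>                    "gdalwarp",
>                    "-te", "-125.0", "32.0", "-114.0", "42.0",
>                    "-tr", "0.01", "0.01",
>                    "-r", "near",
>                    "input.asc", "output.tif");
>            pb.inheritIO();                       // show gdalwarp's output
>            int status = pb.start().waitFor();
>            System.out.println("gdalwarp exited with status " + status);
>        }
>    }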
>
>One of the primary issues that we have to consider is rescaling very large 
>grids. One of the Hydro1k DEM layers has roughly 75 million cells. Simply 
>loading that grid into memory would exceed the available RAM on many machines.
>We thus need code that does not require the entire grid to be loaded into RAM!
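>
>For the regular-grid case, nearest-neighbor rescaling can be done by
>streaming through the source rows, holding only one row in memory at a time.
>A rough sketch under simplifying assumptions (headers already parsed, both
>grids sharing the same upper-left corner, output rows mapping to
>non-decreasing source rows; illustrative only, not the Grid.java code
>mentioned below):
>
>    import java.io.BufferedReader;
>    import java.io.IOException;
>    import java.io.PrintWriter;
>
>    // Nearest-neighbor rescaling of the data body of an ESRI *.asc grid,
>    // reading source rows forward only, one row in memory at a time.
>    public class StreamingRegridSketch {
>        static void rescaleRows(BufferedReader src, PrintWriter dst,
>                                int srcRows, int srcCols, double srcCell,
>                                int outRows, int outCols, double outCell)
>                throws IOException {
>            int srcRow = -1;
>            double[] row = null;
>            for (int r = 0; r < outRows; r++) {
>                // Source row whose center is nearest the output row center.
>                int wanted = Math.min(srcRows - 1,
>                        (int) ((r + 0.5) * outCell / srcCell));
>                while (srcRow < wanted) {            // never read backward
>                    String[] tok = src.readLine().trim().split("\\s+");
>                    row = new double[srcCols];
>                    for (int c = 0; c < srcCols; c++) {
>                        row[c] = Double.parseDouble(tok[c]);
>                    }
>                    srcRow++;
>                }
>                StringBuilder line = new StringBuilder();
>                for (int c = 0; c < outCols; c++) {
>                    int sc = Math.min(srcCols - 1,
>                            (int) ((c + 0.5) * outCell / srcCell));
>                    line.append(row[sc]);
>                    if (c + 1 < outCols) {
>                        line.append(' ');
>                    }
>                }
>                dst.println(line.toString());
>            }
>        }
>    }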
>
>I have also implemented some pure Java code for regridding *.ASC files 
>(i.e. dense, regularly spaced data sets). It turns out that the code for 
>working with regularly spaced data is much simpler than that for irregular 
>data (especially for nearest neighbor); see kepler/src/Grid.java.
>Nearest-neighbor and IDW algorithms have been implemented. To handle large
>grids, I reused the PersistentVector class, which was created for handling
>very large data tables in Morpho. This code
>can thus rescale (or 'cut' sections out of) large *.ASC files. Performance 
>seems to be reasonable, but testing is incomplete. In addition, an actor 
>wrapper is still being put together.
>
>Dan Higgins
>
>--
>*******************************************************************
>Dan Higgins                                  higgins at nceas.ucsb.edu
>http://www.nceas.ucsb.edu/   Ph: 805-892-2531
>National Center for Ecological Analysis and Synthesis (NCEAS) 735 State 
>Street - Room 205
>Santa Barbara, CA 93195
>*******************************************************************
>
>_______________________________________________
>kepler-dev mailing list
>kepler-dev at ecoinformatics.org
>http://www.ecoinformatics.org/mailman/listinfo/kepler-dev

------------
Edward A. Lee, Professor
518 Cory Hall, UC Berkeley, Berkeley, CA 94720
phone: 510-642-0455, fax: 510-642-2718
eal at eecs.Berkeley.EDU, http://ptolemy.eecs.berkeley.edu/~eal



