[seek-dev] Re: implementing garp in ptolemy

Bertram Ludaescher ludaesch at sdsc.edu
Tue Nov 11 10:00:06 PST 2003


Dave: 

As Chad explained, there are plans to provide a generic grid
service actor, somewhat similar to, but probably a bit more involved
than, the current generic WSDL actor that Ilkay et al. have provided.

Indeed, the details of the grid service actor are, I think, yet to be
determined. For example, it may involve changes at the director level
(maybe a G-PN for Grid-enabled process networks!?).

I also agree with you that writing custom actors for Ptolemy may not
be necessary once those generic actors are in place. In fact, we have
discussed here the need for a "command-line/shell-execution" actor
that --when properly parameterized-- can also serve as a generic
plug-in for local applications. I suspect the generic WSDL actor will
often be sufficient for apps that are not too data-intensive.
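[Editor's note: as a rough illustration, the core of such a shell-execution actor could be as simple as running a parameterized command line and capturing its output. This is a minimal Java sketch; the class name and error handling are hypothetical, and none of this is the actual Ptolemy actor API.]

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.List;

// Hypothetical core of a generic shell-execution actor: run a
// parameterized command and return its captured standard output.
public class ShellExec {
    public static String run(List<String> command) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true);   // fold stderr into stdout
        Process p = pb.start();
        StringBuilder out = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        int exit = p.waitFor();         // fail loudly on a non-zero exit code
        if (exit != 0) {
            throw new RuntimeException("command exited with " + exit);
        }
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.print(run(List.of("echo", "hello")));
    }
}
```

In an actual actor, the command and its arguments would presumably come from actor parameters, with the captured output sent out a port.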

The main initial bottleneck with GARP in Ptolemy is/was the lack of
detailed modeling of the input/output signatures of the individual
analysis and computation steps. It seems that Deana, Shawn, Rich, and
the others who were meeting on that in Santa Barbara have made good
progress towards a detailed API.

Who is the one who currently "owns the action item"? ;-)

cheers

Bertram


>>>>> "CB" == Chad Berkley <berkley at nceas.ucsb.edu> writes:
CB> 
CB> Hey Dave,
CB> Thanks for the comments.  We are planning to create a grid service actor 
CB> for ptolemy.  Is there currently a service (or a plan to create a 
CB> service) that runs GARP processes?  If there is, I agree that we should 
CB> just use the grid service and focus on getting a grid service actor 
CB> working in ptolemy.
CB> 
CB> I'm not sure what you mean about the "state of the garp algorithm while 
CB> operating in such an environment".  could you clarify?
CB> 
CB> thanks,
CB> chad
CB> 
CB> Dave Vieglais wrote:
>> Hi all,
>> it should be fairly straightforward to build a low-level Java interface 
>> using SWIG.  I was able to do this for Python without too much effort. 
>> But the problem, as Ricardo mentions, is the interfaces to methods that 
>> can be readily exposed may need to be significantly altered to operate 
>> in the ptolemy environment.  I am also curious about how the state of 
>> the garp algorithm is maintained while operating in such an environment.
>> 
>> I have not been following the ptolemy development stream too closely, 
>> but I was wondering if there is a plan to provide a kind of generic 
>> interface between ptolemy and services exposed through a globus 
>> interface?  It may actually be a simpler process in the long run to 
>> provide a globus interface to garp (e.g. using the gSOAP library and the 
>> GSI plugin http://sara.unile.it/~cafaro/gsi-plugin.html) and have the 
>> garp-ptolemy interface actually implemented as a more generic globus 
>> interface.  This seems to make more sense than building wrappers for 
>> each algorithm just so they can be used in the ptolemy environment.
>> 
>> cheers,
>> Dave V.
>> 
>> Ricardo Scachetti Pereira wrote:
>> 
>>> Chad, Deana and all,
>>> 
>>> GARP is currently implemented as a C++ API (Application Programming 
>>> Interface) that can be put together easily to implement any of those 
>>> analytical steps that are present on the GARP pipelines we produced.
>>> The GARP API is very modular and the main modules match almost 
>>> precisely the various analytical steps described in each GARP pipeline.
>>> Still, we didn't specify exactly what each GARP analytical step 
>>> should do. I think that this was one point that Chad was complaining 
>>> about in his previous messages, wasn't it (not enough detail in the 
>>> pipeline specs)?  Once each analytical step is spec'ed out, one needs 
>>> to code it using GARP API calls in C++ (each would be at most 4 lines 
>>> long!!). Then a Java class would wrap that C++ class. Alternatively, 
>>> the analytical step could be implemented directly in a Java class 
>>> which, in turn, would call the right methods in the GARP API.
>>> That is how I see it. But I know nearly nothing about wrapping C++ 
>>> code in a Java class.
>>> Again, I am pretty sure that the underlying code won't add any 
>>> additional constraints.
>>> All that said, I suggest we implement each GARP analytical step 
>>> separately.
>>> The only tricky part (from my perspective) will be feeding GARP with 
>>> data in the right formats. For example, the environmental layers 
>>> should all match each other exactly (same extent and cell size), which 
>>> usually requires a resample operation prior to modeling. Also, each 
>>> cell value on those layers has to be normalized to fit a byte (1 to 
>>> 254) before it can be processed by the algorithm. The GARP API has 
>>> methods that do that normalization, reading the (resampled) data from 
>>> layers in ESRI ASCII Raster Grid format.
>>> We can talk about those details on a conference call later this 
>>> week, can't we?
>>> Regards,
>>> 
>>> Ricardo
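
[Editor's note: the byte normalization Ricardo describes could be sketched roughly as below -- linearly rescaling each layer's cell values into the 1..254 range, with 0 reserved here for no-data cells. This is an illustration only, not the actual GARP API; all names are hypothetical.]

```java
// Hypothetical sketch of normalizing raster cell values into the byte
// range 1..254, as described for the GARP preprocessing step. Cells
// equal to the no-data value are mapped to 0 (a reserved code).
public class LayerNormalizer {
    public static int[] normalize(double[] cells, double noData) {
        double min = Double.POSITIVE_INFINITY, max = Double.NEGATIVE_INFINITY;
        for (double v : cells) {
            if (v == noData) continue;
            min = Math.min(min, v);
            max = Math.max(max, v);
        }
        double range = (max > min) ? (max - min) : 1.0;
        int[] out = new int[cells.length];
        for (int i = 0; i < cells.length; i++) {
            if (cells[i] == noData) {
                out[i] = 0;  // reserve 0 for no-data
            } else {
                // min maps to 1, max maps to 254
                out[i] = 1 + (int) Math.round((cells[i] - min) / range * 253.0);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        int[] out = normalize(new double[] {0.0, 10.0, -9999.0}, -9999.0);
        System.out.println(out[0] + " " + out[1] + " " + out[2]);
    }
}
```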
>>> 
>>> 
>>> 
>>>> Chad,
>>>> 
>>>> These questions were discussed meticulously in the breakout group at
>>>> Santa Barbara (see...you should have picked the other group :-)
>>>> The conclusion was that there are pieces within GARP that it would be
>>>> nice to reuse, specifically, the sampling piece that splits the input
>>>> samples into two groups (testing and training), which is currently shown
>>>> on the pipeline as a separate piece.  However, Dave thinks it might be
>>>> a lot of work to change the code to make that work (but suggested Ricardo
>>>> might think differently).  If the GARP code can be relatively easily
>>>> rewritten to break out the steps that are in the pipeline, then we
>>>> should do that; otherwise we should lump it all together as one step
>>>> (which would make a much simpler pipeline).
>>>> 
>>>> Shawn has a revised species distribution pipeline that you should make
>>>> sure you have.
>>>> 
>>>> Deana
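
[Editor's note: the sampling piece Deana mentions -- splitting the input samples into testing and training groups -- is simple in isolation. A generic sketch follows; the names are illustrative and not taken from the GARP code base.]

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

// Generic sketch of a train/test sampling step: shuffle the samples
// reproducibly, then cut them into a training list and a testing list.
public class SampleSplitter {
    public static <T> List<List<T>> split(List<T> samples,
                                          double trainFraction, long seed) {
        List<T> shuffled = new ArrayList<>(samples);
        Collections.shuffle(shuffled, new Random(seed));  // reproducible shuffle
        int cut = (int) Math.round(shuffled.size() * trainFraction);
        List<List<T>> result = new ArrayList<>();
        result.add(new ArrayList<>(shuffled.subList(0, cut)));               // training
        result.add(new ArrayList<>(shuffled.subList(cut, shuffled.size()))); // testing
        return result;
    }

    public static void main(String[] args) {
        List<Integer> samples = new ArrayList<>();
        for (int i = 0; i < 10; i++) samples.add(i);
        List<List<Integer>> parts = split(samples, 0.7, 42L);
        System.out.println(parts.get(0).size() + " / " + parts.get(1).size());
    }
}
```

Breaking this piece out as its own actor is exactly what would let it be reused across pipelines.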
>>>> 
>>>> 
>>>> Ricardo Scachetti Pereira wrote:
>>>> 
>>>> 
>>>>> Hi, Chad and all,
>>>>> 
>>>>> I can provide all the details about GARP that you
>>>>> need.  I'm in the middle of a household move, but a phone call on 
>>>>> Wednesday afternoon or Thursday would be good for me.
>>>>> I'll be in Brazil, which is now 6 hours ahead of California time.
>>>>> Let me know whether that works for you.
>>>>> 
>>>>> Regards,
>>>>> 
>>>>> Ricardo
>>>>> 
>>>>> Hi All,
>>>>> 
>>>>> I've been looking at trying to get GARP working in ptolemy a bit, and I
>>>>> have a few questions.  First of all, is the modularity that is described
>>>>> in the PPT files the modularity that you would want in a ptolemy model?
>>>>> For example, is the "data calculation" step in the "GARP Native
>>>>> Species Pipeline" something that you would want to reuse in other
>>>>> pipelines?  When I talked to Dave when we were here in SB, I got the
>>>>> impression that he thought GARP should just be one atomic actor (is that
>>>>> really what you think, Dave?)  If these components are never going to be
>>>>> reused for any other pipeline, then they should probably just be folded
>>>>> into one generic GARP actor.
>>>>> 
>>>>> I'd like to get together, either physically or virtually, with someone who
>>>>> can explain to me the exact steps that it takes to go from the training
>>>>> data and layers to output.  The specifications of the modules within the
>>>>> PPT documents are a bit vague.  For instance, I have no idea what data
>>>>> calculation does.  A mid-level pseudo-code implementation of GARP might
>>>>> be a good idea, as it would help me get my head around this thing a bit
>>>>> more.
>>>>> 
>>>>> Could any of you have a phone call next week?  Maybe Wednesday, since
>>>>> Tuesday is a holiday?
>>>>> 
>>>>> chad
>>>>> -- 
>>>>> -----------------------
>>>>> Chad Berkley
>>>>> National Center for
>>>>> Ecological Analysis
>>>>> and Synthesis (NCEAS)
>>>>> berkley at nceas.ucsb.edu
>>>>> -----------------------
>>>> 
>>>> -- 
>>>> ********
>>>> 
>>>> Deana D. Pennington, PhD
>>>> Long-term Ecological Research Network Office
>>>> 
>>>> UNM Biology Department
>>>> MSC03  2020
>>>> 1 University of New Mexico
>>>> Albuquerque, NM  87131-0001
>>>> 
>>>> 505-272-7288 (office)
>>>> 505 272-7080 (fax)
>>>> 
>>>> 
>>>> _______________________________________________
>>>> seek-dev mailing list
>>>> seek-dev at ecoinformatics.org
>>>> http://www.ecoinformatics.org/mailman/listinfo/seek-dev
CB> 
CB> 
CB> -- 
CB> -----------------------
CB> Chad Berkley
CB> National Center for
CB> Ecological Analysis
CB> and Synthesis (NCEAS)
CB> berkley at nceas.ucsb.edu
CB> -----------------------
CB> 
CB> _______________________________________________
CB> seek-dev mailing list
CB> seek-dev at ecoinformatics.org
CB> http://www.ecoinformatics.org/mailman/listinfo/seek-dev



More information about the Seek-dev mailing list