
openerp-connector-community team mailing list archive

connector design advice


Hi all,

I wrote a custom connector syncing (import only) with a list of XML records provided by a remote server.
This server doesn't allow fetching individual records the way an API would; it only returns bunches of data.
So I get my list of partners only once per BatchImport call on partners.

So far my backend adapter has been handling the XML retrieved from the SOAP call by:
> parsing it into a dict, then
> storing the dict on the backendAdapter object (the one doing the _call).

When running the import of partners, search() retrieves ids from the cached dict,
and read() gets the records the same way.

So far this trick works as long as I’m in DirectBatchImport mode. 
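For clarity, here is a minimal sketch of that caching pattern in plain Python, outside the real connector classes (FakeSoapClient and PartnerAdapter are hypothetical names, not part of the connector API):

```python
class FakeSoapClient:
    """Stands in for the remote server: only returns all records at once."""
    def fetch_all(self):
        return [
            {'id': 1, 'name': 'Alice'},
            {'id': 2, 'name': 'Bob'},
        ]

class PartnerAdapter:
    """Backend adapter caching the parsed records as a dict keyed by id."""
    def __init__(self, client):
        self.client = client
        self._cache = None

    def _call(self):
        # One remote call per batch import; parse into a dict once.
        if self._cache is None:
            self._cache = {rec['id']: rec for rec in self.client.fetch_all()}
        return self._cache

    def search(self):
        # ids come from the in-memory cache, not from the server
        return list(self._call().keys())

    def read(self, record_id):
        # reads are served from the same cache
        return self._call()[record_id]

adapter = PartnerAdapter(FakeSoapClient())
print(adapter.search())   # [1, 2]
print(adapter.read(1))    # {'id': 1, 'name': 'Alice'}
```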

I would now like to import records asynchronously.
Whenever I tried to delay() my jobs, I lost the cache of records stored on the backend object,
obviously because that object no longer existed when the job was processed.
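To make the failure mode concrete, here is a rough sketch of what happens (the names Adapter, enqueue and run_job are hypothetical; the real delay() serializes the job differently, but the point is that only the arguments survive, never the adapter instance):

```python
import json

class Adapter:
    def __init__(self):
        # filled during the batch import, but only on this instance
        self._cache = {}

def enqueue(record_id):
    # delay() stores only the serialized job arguments;
    # the adapter object itself is never part of the job payload.
    return json.dumps({'func': 'import_record', 'args': [record_id]})

def run_job(payload):
    job = json.loads(payload)
    adapter = Adapter()        # a fresh instance at execution time
    record_id = job['args'][0]
    # the cache built during BatchImport is gone here:
    return adapter._cache.get(record_id)

payload = enqueue(1)
print(run_job(payload))   # None: the cached record did not survive
```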

Therefore my question is:

Where would be the best object / place to store the record cache so that the data persists when a job is processed?

Thanks for your time,
Nicolas