[Merge] lp:~numerigraphe-team/ocb-addons/7.0-fill-inventory-OOM into lp:ocb-addons
Loïc Bellier - Numérigraphe has proposed merging lp:~numerigraphe-team/ocb-addons/7.0-fill-inventory-OOM into lp:ocb-addons.
Requested reviews:
Lionel Sausin - Numérigraphe (lionel-sausin)
OpenERP Community Backports Team (ocb)
Related bugs:
Bug #1312045 in OpenERP Community Backports (Addons): "Importing inventory pushes the OS to out-of-memory condition when many Stock Moves exist"
https://bugs.launchpad.net/ocb-addons/+bug/1312045
For more details, see:
https://code.launchpad.net/~numerigraphe-team/ocb-addons/7.0-fill-inventory-OOM/+merge/217049
We're having problems importing an inventory for a location that contains >630000 Stock Moves.
The problem seems to lie in the Python loop over the browse() of the Stock Moves: it preloads too much data and pushes the OS into an out-of-memory condition (which Linux responds to by killing the OpenERP server).
This patch fixes the bug by splitting the list of stock move IDs into separate slices of 10,000 records and browsing one slice at a time.
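For reference, here is a minimal standalone sketch of the chunking pattern the patch applies (the chunk() helper and the fabricated ID list are illustrative only; they are not part of the patch or of the OpenERP API):

import math

CHUNK_SIZE = 10000  # same slice size as in the patch

def chunk(ids, size=CHUNK_SIZE):
    """Yield successive slices of at most `size` elements from `ids`."""
    for i in range(int(math.ceil(len(ids) / float(size)))):
        yield ids[i * size:(i + 1) * size]

# Stand-in for the result of move_obj.search(); in the wizard each slice is
# passed to move_obj.browse(), so the ORM only prefetches about 10,000 records
# at a time instead of the whole result set.
all_move_ids = list(range(630000))
processed = 0
for move_ids in chunk(all_move_ids):
    processed += len(move_ids)
assert processed == len(all_move_ids)

The 10,000 figure is a trade-off: smaller slices lower the peak memory taken by the browse() prefetch cache, at the cost of more round trips to the database.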
Run GREEN on runbot.
MP on standard 7.0
https://code.launchpad.net/~numerigraphe-team/openobject-addons/7.0-fill-inventory-OOM/+merge/217031
--
https://code.launchpad.net/~numerigraphe-team/ocb-addons/7.0-fill-inventory-OOM/+merge/217049
Your team OpenERP Community Backports Team is requested to review the proposed merge of lp:~numerigraphe-team/ocb-addons/7.0-fill-inventory-OOM into lp:ocb-addons.
=== modified file 'stock/wizard/stock_fill_inventory.py'
--- stock/wizard/stock_fill_inventory.py 2014-03-24 09:14:18 +0000
+++ stock/wizard/stock_fill_inventory.py 2014-04-24 13:47:38 +0000
@@ -22,6 +22,7 @@
 from openerp.osv import fields, osv, orm
 from openerp.tools.translate import _
 from openerp.tools import mute_logger
+import math
 
 class stock_fill_inventory(osv.osv_memory):
     _name = "stock.fill.inventory"
@@ -103,28 +104,31 @@
         for location in location_ids:
             datas = {}
             res[location] = {}
-            move_ids = move_obj.search(cr, uid, ['|',('location_dest_id','=',location),('location_id','=',location),('state','=','done')], context=context)
+            all_move_ids = move_obj.search(cr, uid, ['|',('location_dest_id','=',location),('location_id','=',location),('state','=','done')], context=context)
             local_context = dict(context)
             local_context['raise-exception'] = False
-            for move in move_obj.browse(cr, uid, move_ids, context=context):
-                lot_id = move.prodlot_id.id
-                prod_id = move.product_id.id
-                if move.location_dest_id.id != move.location_id.id:
-                    if move.location_dest_id.id == location:
-                        qty = uom_obj._compute_qty_obj(cr, uid, move.product_uom,move.product_qty, move.product_id.uom_id, context=local_context)
-                    else:
-                        qty = -uom_obj._compute_qty_obj(cr, uid, move.product_uom,move.product_qty, move.product_id.uom_id, context=local_context)
-
-
-                    if datas.get((prod_id, lot_id)):
-                        qty += datas[(prod_id, lot_id)]['product_qty']
-
-                    # Floating point sum could introduce tiny rounding errors :
-                    # Use the UoM API for the rounding (same UoM in & out).
-                    qty = uom_obj._compute_qty_obj(cr, uid,
-                        move.product_id.uom_id, qty,
-                        move.product_id.uom_id)
-                    datas[(prod_id, lot_id)] = {'product_id': prod_id, 'location_id': location, 'product_qty': qty, 'product_uom': move.product_id.uom_id.id, 'prod_lot_id': lot_id}
+            # To limit memory usage, process the move IDs in slices of 10,000 elements
+            for i in range(int(math.ceil(len(all_move_ids) / 10000.0))):
+                move_ids = all_move_ids[i * 10000: (i + 1) * 10000]
+                for move in move_obj.browse(cr, uid, move_ids, context=context):
+                    lot_id = move.prodlot_id.id
+                    prod_id = move.product_id.id
+                    if move.location_dest_id.id != move.location_id.id:
+                        if move.location_dest_id.id == location:
+                            qty = uom_obj._compute_qty_obj(cr, uid, move.product_uom,move.product_qty, move.product_id.uom_id, context=local_context)
+                        else:
+                            qty = -uom_obj._compute_qty_obj(cr, uid, move.product_uom,move.product_qty, move.product_id.uom_id, context=local_context)
+
+
+                        if datas.get((prod_id, lot_id)):
+                            qty += datas[(prod_id, lot_id)]['product_qty']
+
+                        # Floating point sum could introduce tiny rounding errors :
+                        # Use the UoM API for the rounding (same UoM in & out).
+                        qty = uom_obj._compute_qty_obj(cr, uid,
+                            move.product_id.uom_id, qty,
+                            move.product_id.uom_id)
+                        datas[(prod_id, lot_id)] = {'product_id': prod_id, 'location_id': location, 'product_qty': qty, 'product_uom': move.product_id.uom_id.id, 'prod_lot_id': lot_id}
             if datas:
                 flag = True