launchpad-reviewers team mailing list archive: Message #13422
[Merge] lp:~wallyworld/launchpad/markUserAffected-queries-1066647 into lp:launchpad
Ian Booth has proposed merging lp:~wallyworld/launchpad/markUserAffected-queries-1066647 into lp:launchpad.
Commit message:
Introduce bulk update operations for bugs with duplicates affecting users and bug heat updates
Requested reviews:
Launchpad code reviewers (launchpad-reviewers)
Related bugs:
Bug #1066647 in Launchpad itself: "Bug.markUserAffected makes a query for every duplicate bug"
https://bugs.launchpad.net/launchpad/+bug/1066647
For more details, see:
https://code.launchpad.net/~wallyworld/launchpad/markUserAffected-queries-1066647/+merge/129793
== Implementation ==
The Bug.markUserAffected() method was looping over all the dupes one by one. I changed it to do a bulk update of the BugAffectsPerson records instead. To do the work, I added a new update() method to lp.services.database.bulk, since nothing suitable existed there already.
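For quick reference, the new call site in markUserAffected() ends up looking roughly like this; it's a simplified sketch of the change to lib/lp/bugs/model/bug.py shown in the diff below:

    # Update the affected flag on every duplicate that already has a
    # BugAffectsPerson row for this user, in a single UPDATE statement.
    dupe_bug_ids = [dupe.id for dupe in self.duplicates]
    if dupe_bug_ids:
        bulk.update(
            where=And(
                BugAffectsPerson.person == user,
                BugAffectsPerson.bugID.is_in(dupe_bug_ids)),
            col_values={BugAffectsPerson.affected: affected})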
As part of the implementation, I killed off the IBug.updateHeat() method and introduced a bulk update_bug_heat() helper. This eliminates the individual heat-update calls for each dupe.
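In practice, callers now pass a list of bug ids rather than calling a method per bug; a minimal before/after sketch (the doctest change below shows the same thing):

    # Before: one stored-procedure call per bug.
    bug.updateHeat()

    # After: a single UPDATE covering any number of bugs.
    from lp.bugs.model.bug import update_bug_heat
    update_bug_heat([bug.id])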
I added a new update_heat parameter to IBug.addChange() so that the heat update inside that method can be skipped (the default is still True). This allows code that calls addChange() in a loop to do one bulk heat update at the end, which is what the dupe handling now does.
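The net effect in markAsDuplicate() is that heat is recalculated once for the whole set of affected bugs; a condensed sketch of the change in the diff below:

    def markAsDuplicate(self, duplicate_of):
        """See `IBug`."""
        # _markAsDuplicate() records each BugDuplicateChange with
        # update_heat=False and collects the ids of every bug whose heat
        # is affected, so a single bulk heat update can be done here.
        affected_bug_ids = set()
        self._markAsDuplicate(duplicate_of, affected_bug_ids)
        update_bug_heat(affected_bug_ids)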
== Tests ==
Update the bug-heat.txt tests to use the new update_bug_heat() helper.
Add tests for the new bulk update() function.
Existing tests for bug behaviour will pick up any regressions.
== Lint ==
Checking for conflicts and issues in changed files.
Linting changed files:
lib/lp/bugs/doc/bug-heat.txt
lib/lp/bugs/interfaces/bug.py
lib/lp/bugs/model/bug.py
lib/lp/services/database/bulk.py
lib/lp/services/database/tests/test_bulk.py
--
https://code.launchpad.net/~wallyworld/launchpad/markUserAffected-queries-1066647/+merge/129793
Your team Launchpad code reviewers is requested to review the proposed merge of lp:~wallyworld/launchpad/markUserAffected-queries-1066647 into lp:launchpad.
=== modified file 'lib/lp/bugs/doc/bug-heat.txt'
--- lib/lp/bugs/doc/bug-heat.txt 2012-04-19 20:56:13 +0000
+++ lib/lp/bugs/doc/bug-heat.txt 2012-10-16 02:07:21 +0000
@@ -15,8 +15,8 @@
Updating bug heat on-the-fly
----------------------------
-The IBug.updateHeat() method updates a Bug's heat using data already in
-the database. updateHeat() uses a stored procedure in the database to
+The update_bug_heat method updates a Bug's heat using data already in
+the database. update_bug_heat() uses a stored procedure in the database to
calculate the heat overall.
We'll create a new bug with a heat of 0 for the sake of testing.
@@ -26,11 +26,12 @@
>>> from zope.security.proxy import removeSecurityProxy
>>> removeSecurityProxy(bug).heat = 0
-Calling updateHeat() will update the bug's heat. Since this new bug has
+Calling update_bug_heat() will update the bug's heat. Since this new bug has
one subscriber (the bug owner) and one affected user (ditto) its
heat after update will be 6.
- >>> bug.updateHeat()
+ >>> from lp.bugs.model.bug import update_bug_heat
+ >>> update_bug_heat([bug.id])
>>> bug.heat
6
=== modified file 'lib/lp/bugs/interfaces/bug.py'
--- lib/lp/bugs/interfaces/bug.py 2012-09-17 16:13:40 +0000
+++ lib/lp/bugs/interfaces/bug.py 2012-10-16 02:07:21 +0000
@@ -702,7 +702,7 @@
If a BugActivity instance is provided as an `activity`, it is linked
to the notification."""
- def addChange(change, recipients=None):
+ def addChange(change, recipients=None, update_heat=True):
"""Record a change to the bug.
:param change: An `IBugChange` instance from which to take the
@@ -710,6 +710,7 @@
:param recipients: A set of `IBugNotificationRecipient`s to whom
to send notifications about this change. If None is passed
the default list of recipients for the bug will be used.
+ :param update_heat: Whether to update the bug heat.
"""
@operation_parameters(
@@ -979,9 +980,6 @@
Return None if no bugtask was edited.
"""
- def updateHeat():
- """Update the heat for the bug."""
-
class IBug(IBugPublic, IBugView, IBugEdit, IHasLinkedBranches):
"""The core bug entry."""
=== modified file 'lib/lp/bugs/model/bug.py'
--- lib/lp/bugs/model/bug.py 2012-10-12 11:31:52 +0000
+++ lib/lp/bugs/model/bug.py 2012-10-16 02:07:21 +0000
@@ -192,6 +192,7 @@
from lp.registry.model.pillar import pillar_sort_key
from lp.registry.model.teammembership import TeamParticipation
from lp.services.config import config
+from lp.services.database import bulk
from lp.services.database.constants import UTC_NOW
from lp.services.database.datetimecol import UtcDateTimeCol
from lp.services.database.decoratedresultset import DecoratedResultSet
@@ -306,6 +307,22 @@
self.user = user
+def update_bug_heat(bug_ids):
+ """Update the heat for the specified bugs."""
+ # We need to flush the store first to ensure that changes are
+ # reflected in the new bug heat total.
+ if not bug_ids:
+ return
+ store = IStore(Bug)
+ store.flush()
+
+ store.find(
+ Bug, Bug.id.is_in(bug_ids)).set(
+ heat=SQL('calculate_bug_heat(Bug.id)'),
+ heat_last_updated=UTC_NOW)
+ store.flush()
+
+
class Bug(SQLBase, InformationTypeMixin):
"""A bug."""
@@ -823,7 +840,7 @@
if suppress_notify is False:
notify(ObjectCreatedEvent(sub, user=subscribed_by))
- self.updateHeat()
+ update_bug_heat([self.id])
return sub
def unsubscribe(self, person, unsubscribed_by, **kwargs):
@@ -857,7 +874,7 @@
# flushed so that code running with implicit flushes
# disabled see the change.
store.flush()
- self.updateHeat()
+ update_bug_heat([self.id])
del get_property_cache(self)._known_viewers
# Revoke access to bug
@@ -1110,7 +1127,8 @@
bug=self, is_comment=True,
message=message, recipients=recipients, activity=activity)
- def addChange(self, change, recipients=None, deferred=False):
+ def addChange(self, change, recipients=None, deferred=False,
+ update_heat=True):
"""See `IBug`."""
when = change.when
if when is None:
@@ -1142,7 +1160,8 @@
recipients=recipients, activity=activity,
deferred=deferred)
- self.updateHeat()
+ if update_heat:
+ update_bug_heat([self.id])
def expireNotifications(self):
"""See `IBug`."""
@@ -1767,7 +1786,7 @@
if already_subscribed_teams.is_empty():
self.subscribe(s, who)
- self.updateHeat()
+ update_bug_heat([self.id])
# As a result of the transition, some subscribers may no longer
# have access to the bug. We need to run a job to remove any such
@@ -1870,23 +1889,28 @@
bap = self._getAffectedUser(user)
if bap is None:
BugAffectsPerson(bug=self, person=user, affected=affected)
- self._flushAndInvalidate()
else:
if bap.affected != affected:
bap.affected = affected
- self._flushAndInvalidate()
-
- # Loop over dupes.
- for dupe in self.duplicates:
- if dupe._getAffectedUser(user) is not None:
- dupe.markUserAffected(user, affected)
+
+ dupe_bug_ids = [dupe.id for dupe in self.duplicates]
+ # Where BugAffectsPerson records already exist for each duplicate,
+ # update the affected status.
+ if dupe_bug_ids:
+ bulk.update(
+ where=And(
+ BugAffectsPerson.person == user,
+ BugAffectsPerson.bugID.is_in(dupe_bug_ids)),
+ col_values={BugAffectsPerson.affected: affected})
+
+ self._flushAndInvalidate()
if affected:
self.maybeConfirmBugtasks()
- self.updateHeat()
+ update_bug_heat(dupe_bug_ids + [self.id])
- def _markAsDuplicate(self, duplicate_of):
+ def _markAsDuplicate(self, duplicate_of, affected_bug_ids):
"""Mark this bug as a duplicate of another.
Marking a bug as a duplicate requires a recalculation of the
@@ -1905,7 +1929,7 @@
user = getUtility(ILaunchBag).user
for duplicate in self.duplicates:
old_value = duplicate.duplicateof
- duplicate._markAsDuplicate(duplicate_of)
+ duplicate._markAsDuplicate(duplicate_of, affected_bug_ids)
# Put an entry into the BugNotification table for
# later processing.
change = BugDuplicateChange(
@@ -1915,12 +1939,15 @@
new_value=duplicate_of)
empty_recipients = BugNotificationRecipients()
duplicate.addChange(
- change, empty_recipients, deferred=True)
+ change, empty_recipients, deferred=True,
+ update_heat=False)
+ affected_bug_ids.add(duplicate.id)
self.duplicateof = duplicate_of
except LaunchpadValidationError as validation_error:
raise InvalidDuplicateValue(validation_error)
if duplicate_of is not None:
+ affected_bug_ids.add(duplicate_of.id)
# Maybe confirm bug tasks, now that more people might be affected
# by this bug from the duplicates.
duplicate_of.maybeConfirmBugtasks()
@@ -1928,13 +1955,13 @@
# Update the former duplicateof's heat, as it will have been
# reduced by the unduping.
if current_duplicateof is not None:
- current_duplicateof.updateHeat()
+ affected_bug_ids.add(current_duplicateof.id)
def markAsDuplicate(self, duplicate_of):
"""See `IBug`."""
- self._markAsDuplicate(duplicate_of)
- if duplicate_of is not None:
- duplicate_of.updateHeat()
+ affected_bug_ids = set()
+ self._markAsDuplicate(duplicate_of, affected_bug_ids)
+ update_bug_heat(affected_bug_ids)
def setCommentVisibility(self, user, comment_number, visible):
"""See `IBug`."""
@@ -2067,17 +2094,6 @@
return not subscriptions_from_dupes.is_empty()
- def updateHeat(self):
- """See `IBug`."""
- # We need to flush the store first to ensure that changes are
- # reflected in the new bug heat total.
- store = Store.of(self)
- store.flush()
-
- self.heat = SQL("calculate_bug_heat(%s)" % sqlvalues(self))
- self.heat_last_updated = UTC_NOW
- store.flush()
-
def _reconcileAccess(self):
# reconcile_access_for_artifact will only use the pillar list if
# the information type is private. But affected_pillars iterates
@@ -2629,7 +2645,7 @@
notify(event)
# Calculate the bug's initial heat.
- bug.updateHeat()
+ update_bug_heat([bug.id])
if not notify_event:
return bug, event
=== modified file 'lib/lp/services/database/bulk.py'
--- lib/lp/services/database/bulk.py 2012-04-06 17:28:25 +0000
+++ lib/lp/services/database/bulk.py 2012-10-16 02:07:21 +0000
@@ -10,6 +10,7 @@
'load_referencing',
'load_related',
'reload',
+ 'update',
]
@@ -27,6 +28,7 @@
Insert,
Or,
SQL,
+ Update,
)
from storm.info import (
get_cls_info,
@@ -199,7 +201,7 @@
get_primary_keys=False):
"""Create a large number of objects efficiently.
- :param cols: The Storm columns to insert values into. Must be from a
+ :param columns: The Storm columns to insert values into. Must be from a
single class.
:param values: A list of lists of values for the columns.
:param get_objects: Return the created objects.
@@ -243,3 +245,33 @@
else:
IStore(cls).execute(Insert(db_cols, values=db_values))
return None
+
+
+def update(where, col_values, values=None):
+ """Update a large number of objects efficiently.
+
+ :param col_values: Dictionary mapping columns to values, or the Storm
+ columns to update. Must be from a single class.
+ :param values: None, or a list of values for the columns. Only required
+ if col_values contains a list of columns rather than a dict.
+ :return: the number of records updated.
+ """
+ # Flatten Reference faux-columns into their primary keys.
+ columns = tuple(col_values)
+ db_cols = list(chain.from_iterable(map(_dbify_column, columns)))
+ clses = set(col.cls for col in db_cols)
+ if len(clses) != 1:
+ raise ValueError(
+ "The Storm columns to insert values into must be from a single "
+ "class.")
+ [cls] = clses
+
+ # Mangle our value list into compilable values. Normal columns just
+ # get passed through the variable factory, while References get
+ # squashed into primary key variables.
+ values = values or col_values.values()
+ db_values = list(chain.from_iterable(
+ _dbify_value(col, val) for col, val in zip(columns, values)))
+
+ return IStore(cls).execute(
+ Update(dict(zip(db_cols, db_values)), where=where)).rowcount
=== modified file 'lib/lp/services/database/tests/test_bulk.py'
--- lib/lp/services/database/tests/test_bulk.py 2012-03-13 00:45:33 +0000
+++ lib/lp/services/database/tests/test_bulk.py 2012-10-16 02:07:21 +0000
@@ -43,6 +43,7 @@
from lp.services.job.model.job import Job
from lp.soyuz.model.component import Component
from lp.testing import (
+ person_logged_in,
StormStatementRecorder,
TestCase,
TestCaseWithFactory,
@@ -344,3 +345,117 @@
SQL("CURRENT_TIMESTAMP AT TIME ZONE 'UTC'"),
BugNotificationLevel.LIFECYCLE)], get_objects=True)
self.assertEqual(get_transaction_timestamp(), sub.date_created)
+
+
+class TestUpdate(TestCaseWithFactory):
+
+ layer = DatabaseFunctionalLayer
+
+ def _assert_references_and_enums(self, perform_update):
+ # update() correctly compiles plain types, enums and references.
+ bug = self.factory.makeBug()
+ people = [self.factory.makePerson() for i in range(5)]
+
+ with person_logged_in(bug.owner):
+ for person in people:
+ bug.subscribe(
+ person, bug.owner, level=BugNotificationLevel.LIFECYCLE)
+ new_subscribed_by = self.factory.makePerson()
+ recorder, updated_count = perform_update(bug, new_subscribed_by)
+ self.assertThat(recorder, HasQueryCount(Equals(1)))
+ self.assertEqual(6, updated_count)
+
+ transaction.commit()
+ store = Store.of(bug)
+ store.flush()
+ store.invalidate(bug)
+
+ subs = list(bug.subscriptions)
+ self.assertEqual(6, len(subs))
+ for sub in subs:
+ self.assertEqual(new_subscribed_by, sub.subscribed_by)
+ self.assertEqual(
+ BugNotificationLevel.COMMENTS, sub.bug_notification_level)
+
+ def test_references_and_enums_using_map(self):
+ # update() correctly compiles plain types, enums and references when
+ # passed in as a map of columns->values.
+
+ def perform_update(bug, new_subscribed_by):
+ with StormStatementRecorder() as recorder:
+ updated_count = bulk.update(
+ BugSubscription.bug_id == bug.id,
+ {BugSubscription.subscribed_by: new_subscribed_by,
+ BugSubscription.bug_notification_level:
+ BugNotificationLevel.COMMENTS})
+ return recorder, updated_count
+
+ self._assert_references_and_enums(perform_update)
+
+ def test_references_and_enums_using_list(self):
+ # update() correctly compiles plain types, enums and references when
+ # passed in as separate lists of columns and values.
+
+ def perform_update(bug, new_subscribed_by):
+ with StormStatementRecorder() as recorder:
+ updated_count = bulk.update(
+ BugSubscription.bug_id == bug.id,
+ [BugSubscription.subscribed_by,
+ BugSubscription.bug_notification_level],
+ [new_subscribed_by, BugNotificationLevel.COMMENTS])
+ return recorder, updated_count
+
+ self._assert_references_and_enums(perform_update)
+
+ def test_null_reference(self):
+ # update() handles None as a Reference value.
+ job = IStore(Job).add(Job())
+ branch = self.factory.makeBranch()
+ [branchjob] = bulk.create(
+ (BranchJob.branch, BranchJob.job, BranchJob.job_type),
+ [(branch, job, BranchJobType.RECLAIM_BRANCH_SPACE)],
+ get_objects=True)
+ bulk.update(
+ BranchJob.jobID == branchjob.job.id, {BranchJob.branch: None})
+
+ transaction.commit()
+ store = Store.of(branchjob)
+ store.flush()
+ store.invalidate(branchjob)
+
+ self.assertEqual(None, branchjob.branch)
+
+ def test_fails_on_multiple_classes(self):
+ # update() only works with columns on a single class.
+ self.assertRaises(
+ ValueError, bulk.update, True,
+ (BugSubscription.bug, BranchSubscription.branch))
+
+ def test_fails_on_reference_mismatch(self):
+ # update() handles Reference columns in a typesafe manner.
+ self.assertRaisesWithContent(
+ RuntimeError, "Property used in an unknown class",
+ bulk.update, True,
+ {BugSubscription.bug: self.factory.makeBranch()})
+
+ def test_sql_passed_through(self):
+ # update() passes SQL() expressions through untouched.
+ bug = self.factory.makeBug()
+ person = self.factory.makePerson()
+
+ with person_logged_in(bug.owner):
+ bug.subscribe(
+ person, bug.owner, level=BugNotificationLevel.LIFECYCLE)
+ bulk.update(
+ BugSubscription.bug_id == bug.id,
+ {BugSubscription.date_created:
+ SQL("CURRENT_TIMESTAMP AT TIME ZONE 'UTC'")})
+
+ expected_timestamp = get_transaction_timestamp()
+ transaction.commit()
+ store = Store.of(bug)
+ store.flush()
+ store.invalidate(bug)
+
+ sub = bug.subscriptions[0]
+ self.assertEqual(expected_timestamp, sub.date_created)