launchpad-reviewers team mailing list archive

[Merge] ~pappacena/launchpad:storm_select_related into launchpad:master

 

Thiago F. Pappacena has proposed merging ~pappacena/launchpad:storm_select_related into launchpad:master.

Commit message:
[DONT MERGE] Adding an easy way to select related objects on Storm (discussion only)

Requested reviews:
  Launchpad code reviewers (launchpad-reviewers)

For more details, see:
https://code.launchpad.net/~pappacena/launchpad/+git/launchpad/+merge/377968

I'm creating this MP as a place to discuss an initial idea. Please don't merge it: if others think the idea is worthwhile, it's probably better to propose the change upstream in Storm.

One problem we have today in LP is that, in order to prefetch related objects (the reviewer Person for a PackageUploadLog, for example), we first fetch the list of primary objects (PackageUploadLog), collect the related IDs, and then issue a second query to fetch the related class of objects (the reviewer Persons).

This is obviously better than issuing one query per PackageUploadLog, but it's still one query worse than it would be if we just LEFT JOINed PackageUploadLog with Person in this simple scenario (ignoring the other complexities of the Person entity here).
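
For concreteness, this is roughly what the current pattern looks like with the bulk-loading helpers we already use (a sketch; PackageUploadLog/Person are just the example from above):

    from lp.registry.model.person import Person
    from lp.services.database.bulk import load_related
    from lp.services.database.interfaces import IStore
    from lp.soyuz.model.queue import PackageUploadLog

    # Query 1: fetch the primary objects.
    logs = list(IStore(PackageUploadLog).find(PackageUploadLog))
    # Query 2: fetch every referenced Person in one extra query.
    load_related(Person, logs, ['reviewer_id'])
    # log.reviewer is now served from the cache, but we paid for two
    # round trips where a single LEFT JOIN would have been enough.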

Django solves this easily with something like `PackageUploadLog.objects.all().select_related('reviewer')`. I couldn't find an equally easy way to do it with the Storm ORM without manually joining the table and writing out the join condition, which is far more verbose than what Django requires.
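
For comparison, the manual Storm spelling of the same join looks something like this (again just a sketch):

    from storm.expr import LeftJoin

    from lp.registry.model.person import Person
    from lp.services.database.interfaces import IStore
    from lp.soyuz.model.queue import PackageUploadLog

    # The caller has to name the second table, write the join condition
    # by hand and ask for a tuple, only to discard the Person half of it
    # afterwards.
    origin = [
        PackageUploadLog,
        LeftJoin(Person, Person.id == PackageUploadLog.reviewer_id),
        ]
    rows = IStore(PackageUploadLog).using(*origin).find(
        (PackageUploadLog, Person))
    logs = [log for log, _reviewer in rows]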

The idea here is to implement something similar to what we have in Django:
`resultset = store.find(PackageUploadLog).select_related(PackageUploadLog.reviewer)` would join both tables with a LEFT JOIN and fill the store with all of the objects, so that a subsequent `[p.reviewer for p in resultset]` is served from the local store and no extra query is needed.

This branch already works and doesn't introduce regressions in LP (although the code is terribly ugly and not well tested; I would need to work out the implementation details with the Storm maintainers, if any!).
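
At its core, the branch derives the join from the Reference itself; stripped of the plumbing, what it does per Reference is roughly:

    from storm.expr import LeftJoin, compare_columns
    from storm.info import ClassAlias

    def left_join_for(reference):
        """Build an aliased LEFT JOIN for a Storm Reference.

        Simplified from ResultSet.select_related() in the diff below:
        the join condition comes from the Reference's own key columns,
        so the caller never has to spell it out.
        """
        rel = reference._relation
        alias = ClassAlias(rel.remote_cls)
        remote_key = tuple(
            getattr(alias, column.name) for column in rel.remote_key)
        return alias, LeftJoin(
            alias, compare_columns(rel.local_key, remote_key))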

Implementing something similar to Django's `prefetch_related` (to prefetch to-many relations with a separate query) could be a future possibility.
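
Purely as an illustration of that future idea (nothing like this exists in the branch), a prefetch_related-style API would keep the extra query but also cover to-many relations:

    from lp.services.database.interfaces import IStore
    from lp.soyuz.model.queue import PackageUpload

    # Hypothetical API, mirroring Django's prefetch_related(): one extra
    # query per named relation, with the rows stitched onto each upload
    # in Python rather than pulled in via a JOIN.
    uploads = IStore(PackageUpload).find(
        PackageUpload).prefetch_related('logs')
    for upload in uploads:
        upload.logs  # already populated; no per-row query
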
-- 
Your team Launchpad code reviewers is requested to review the proposed merge of ~pappacena/launchpad:storm_select_related into launchpad:master.
diff --git a/database/schema/security.cfg b/database/schema/security.cfg
index 06da0d4..2844a30 100644
--- a/database/schema/security.cfg
+++ b/database/schema/security.cfg
@@ -1,4 +1,4 @@
-# Copyright 2009-2019 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2020 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 #
 # Possible permissions: SELECT, INSERT, UPDATE, EXECUTE
@@ -1080,6 +1080,7 @@ public.packagesetgroup                          = SELECT, INSERT
 public.packagesetinclusion                      = SELECT, INSERT
 public.packagesetsources                        = SELECT, INSERT
 public.packageupload                            = SELECT, INSERT, UPDATE
+public.packageuploadlog                         = SELECT
 public.packageuploadsource                      = SELECT
 public.packageuploadbuild                       = SELECT
 public.packageuploadcustom                      = SELECT, INSERT
@@ -1242,6 +1243,7 @@ public.ociprojectname                   = SELECT, INSERT, UPDATE
 public.ociprojectseries                 = SELECT, INSERT, UPDATE, DELETE
 public.openididentifier                 = SELECT
 public.packageupload                    = SELECT, INSERT, UPDATE
+public.packageuploadlog                 = SELECT, INSERT
 public.packageuploadbuild               = SELECT, INSERT, UPDATE
 public.packageuploadcustom              = SELECT, INSERT, UPDATE
 public.packageuploadsource              = SELECT, INSERT, UPDATE
@@ -2293,6 +2295,7 @@ public.packagediff                      = SELECT, UPDATE
 public.packageset                       = SELECT, UPDATE
 public.packagesetgroup                  = SELECT, UPDATE
 public.packageupload                    = SELECT, UPDATE
+public.packageuploadlog                 = SELECT, UPDATE
 public.packaging                        = SELECT, UPDATE
 public.person                           = SELECT, UPDATE
 public.personlanguage                   = SELECT, UPDATE
diff --git a/lib/lp/services/database/configure.zcml b/lib/lp/services/database/configure.zcml
index 3795b28..d21caea 100644
--- a/lib/lp/services/database/configure.zcml
+++ b/lib/lp/services/database/configure.zcml
@@ -17,4 +17,64 @@
         <allow attributes="__getslice__ get_plain_result_set" />
     </class>
 
+    <class class="lp.services.database.select_related.ResultSet">
+        <allow attributes="
+            __class__
+            __contains__
+            __delattr__
+            __dict__
+            __doc__
+            __format__
+            __getattribute__
+            __getitem__
+            __getslice__
+            __hash__
+            __implemented__
+            __init__
+            __iter__
+            __module__
+            __new__
+            __providedBy__
+            __provides__
+            __reduce__
+            __reduce_ex__
+            __repr__
+            __setattr__
+            __sizeof__
+            __str__
+            __subclasshook__
+            __weakref__
+            _aggregate
+            _any
+            _get_select
+            _load_objects
+            _set_expr
+            any
+            avg
+            cached
+            config
+            copy
+            count
+            difference
+            find
+            first
+            get_select_expr
+            group_by
+            having
+            intersection
+            is_empty
+            last
+            max
+            min
+            one
+            order_by
+            remove
+            set
+            sum
+            union
+            values
+            select_related"
+        />
+    </class>
+
 </configure>
diff --git a/lib/lp/services/database/select_related.py b/lib/lp/services/database/select_related.py
new file mode 100644
index 0000000..eabf546
--- /dev/null
+++ b/lib/lp/services/database/select_related.py
@@ -0,0 +1,145 @@
+from __future__ import absolute_import, print_function, unicode_literals
+
+__metaclass__ = type
+
+from storm import Undef
+from storm.expr import LeftJoin, compare_columns
+from storm.info import get_cls_info, ClassAlias
+from storm.properties import PropertyColumn
+from storm.store import (
+    ResultSet as StormResultSet, FindSpec as StormFindSpec, Store)
+
+
+class FindSpec(StormFindSpec):
+    def __init__(self, cls_spec, *args, **kwargs):
+        super(FindSpec, self).__init__(cls_spec, *args, **kwargs)
+        self.cls_spec = cls_spec
+        self._to_select_related = {}
+
+    @staticmethod
+    def tuplify(obj):
+        """Transform a given object in a tuple, if it's not yet one"""
+        if not isinstance(obj, tuple):
+            return (obj, )
+        return obj
+
+    def add_select_related(self, reference, cls_alias):
+        self._to_select_related[reference] = cls_alias
+
+    def filter_values_for_related(self, values, reference):
+        """From the given row, filter only the values refering to the given
+        reference
+
+        In summary, this method returns a tuple in the same format that
+        would be returned by selecting a row for only the given Reference
+        attribute (or None, for the main cls_spec given)
+        """
+        # If called with store.find(Column), do not try to filter anything
+        skip_filter = isinstance(self.cls_spec, PropertyColumn)
+        # If no select_related was asked, do not try to filter also
+        skip_filter = skip_filter or not len(self._to_select_related)
+        if skip_filter:
+            return values
+
+        cols, tables = self.get_columns_and_tables()
+        if reference is None:
+            table_alias = self.tuplify(self.cls_spec)
+        else:
+            table_alias = [self._to_select_related[reference]]
+
+        target_cols = set([i for i in cols if i.table in table_alias])
+        return tuple(
+            [value for col, value in zip(cols, values)
+             if col in target_cols])
+
+    def load_related_objects(self, store, result, values):
+        """Load into the store the objects fetched through
+        ResultSet.select_related (stored here as self._to_select_related)
+        """
+        for ref, cls_alias in self._to_select_related.items():
+            cls_info = get_cls_info(ref._relation.remote_cls)
+            ref_values = self.filter_values_for_related(values, ref)
+            store._load_object(cls_info, result, ref_values)
+
+    def load_objects(self, store, result, values):
+        """Load into the store the objects from this FindSpec, and also the
+        """
+        self.load_related_objects(store, result, values)
+        cls_values = self.filter_values_for_related(values, None)
+        result = super(FindSpec, self).load_objects(
+            store, result, cls_values)
+
+        # If cls_spec contains entries that were added by select_related,
+        # exclude them from the final result.
+        related_slice = len(self._to_select_related)
+        if related_slice:
+            result = result[:-related_slice]
+
+            # If the result only became a tuple because of select_related,
+            # return the single remaining object (rather than a tuple).
+            if len(self.cls_spec) - 1 == related_slice:
+                return result[0]
+            else:
+                return result
+        return result
+
+
+class ResultSet(StormResultSet):
+    def __init__(self, *args, **kwargs):
+        super(ResultSet, self).__init__(*args, **kwargs)
+        self._to_select_related = []
+
+    def _get_left_join_condition(self, rel, cls_alias):
+        # Rebuild the foreign key condition using cls_alias instead of the
+        # original class name
+        remote_alias_key = tuple(
+            getattr(cls_alias, r.name) for r in rel.remote_key)
+        return compare_columns(rel.local_key, remote_alias_key)
+
+    def select_related(self, *args):
+        if len(args) == 0:
+            return self
+        rs = self.copy()
+
+        cls_spec = self._find_spec.cls_spec
+        if not isinstance(cls_spec, tuple):
+            cls_spec = (cls_spec,)
+        cls_spec = list(cls_spec)
+
+        if rs._tables == Undef:
+            rs._tables = [i for i in cls_spec]
+
+        cls_alias_per_ref = {}
+        for ref in args:
+            rel = ref._relation
+            cls_alias = ClassAlias(rel.remote_cls)
+            left_condition = self._get_left_join_condition(rel, cls_alias)
+            rs._tables.append(LeftJoin(cls_alias, left_condition))
+            cls_spec.append(cls_alias)
+            cls_alias_per_ref[ref] = cls_alias
+
+        # Rebuild find spec
+        rs._find_spec = FindSpec(tuple(cls_spec))
+        rs._find_spec._to_select_related = cls_alias_per_ref
+        return rs
+
+
+# Replace Storm's result set factory with our ResultSet.
+Store._result_set_factory = ResultSet
+
+# Replace FindSpec with our subclass at module level.
+import storm.store
+storm.store.FindSpec = FindSpec
+
+# Wrap Store.find so that its result sets use our FindSpec too.
+original_find = Store.find
+
+def _find(self, cls_spec, *args, **kwargs):
+    resultset = original_find(self, cls_spec, *args, **kwargs)
+    resultset._find_spec = FindSpec(cls_spec)
+    return resultset
+
+Store.find = _find
diff --git a/lib/lp/services/database/tests/test_select_related.py b/lib/lp/services/database/tests/test_select_related.py
new file mode 100644
index 0000000..55860f9
--- /dev/null
+++ b/lib/lp/services/database/tests/test_select_related.py
@@ -0,0 +1,57 @@
+from lp.bugs.model.bug import Bug
+from lp.services.database.select_related import ResultSet
+from lp.testing import TestCaseWithFactory, StormStatementRecorder, login
+from lp.testing.layers import DatabaseFunctionalLayer
+from lp.testing.matchers import HasQueryCount
+from lp.testing.sampledata import ADMIN_EMAIL
+from storm.store import Store
+from testtools.matchers import Equals
+from zope.security.proxy import removeSecurityProxy
+
+
+class TestLoaders(TestCaseWithFactory):
+    layer = DatabaseFunctionalLayer
+
+    def test_select_related_object_does_left_join(self):
+        # we have 11 bugs of the same owner
+        duplicate = self.factory.makeBug()
+        store = Store.of(duplicate)
+        bug1 = self.factory.makeBug()  # this bug doesn't have a duplicate
+        for i in range(10):
+            b = removeSecurityProxy(self.factory.makeBug(owner=bug1.owner))
+            # XXX: Test with left join
+            b.duplicateof = duplicate
+            store.add(duplicate)
+
+        # and 2 other random bugs
+        self.factory.makeBug()
+        self.factory.makeBug()
+
+        store.flush()
+        store.commit()
+
+        bugs_filter = Bug.owner == bug1.owner
+        store.invalidate()
+        # Fetch bug1's owner outside the recorder for the checks below.
+        bug1.owner
+        with StormStatementRecorder() as recorder:
+            rs = store.find(Bug, bugs_filter)
+            bugs = rs.select_related(
+                Bug.owner, Bug.who_made_private, Bug.duplicateof)
+
+            # One query for the count
+            self.assertEqual(11, bugs.count())
+
+            # This should trigger only one query to database
+            # (even if we are fetching related objects)
+            for bug in bugs:
+                bug = removeSecurityProxy(bug)
+                self.assertIsNotNone(bug.date_last_message)
+                self.assertEqual(bug.owner.name, bug1.owner.name)
+                self.assertIsNone(bug.who_made_private)
+                if bug.id == bug1.id:
+                    self.assertIsNone(bug.duplicateof)
+                else:
+                    self.assertIsNotNone(bug.duplicateof)
+
+        self.assertThat(recorder, HasQueryCount(Equals(2)))
\ No newline at end of file
diff --git a/lib/lp/soyuz/browser/queue.py b/lib/lp/soyuz/browser/queue.py
index a476e1d..a036f11 100644
--- a/lib/lp/soyuz/browser/queue.py
+++ b/lib/lp/soyuz/browser/queue.py
@@ -1,4 +1,4 @@
-# Copyright 2009-2017 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2020 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Browser views for package queue."""
@@ -10,6 +10,7 @@ __all__ = [
     'QueueItemsView',
     ]
 
+from collections import defaultdict
 from operator import attrgetter
 
 from lazr.delegates import delegate_to
@@ -68,6 +69,7 @@ from lp.soyuz.model.files import (
 from lp.soyuz.model.packagecopyjob import PackageCopyJob
 from lp.soyuz.model.queue import (
     PackageUploadBuild,
+    PackageUploadLog,
     PackageUploadSource,
     )
 from lp.soyuz.model.sourcepackagerelease import SourcePackageRelease
@@ -207,7 +209,23 @@ class QueueItemsView(LaunchpadView):
         jobs = load_related(Job, package_copy_jobs, ['job_id'])
         person_ids.extend(map(attrgetter('requester_id'), jobs))
         list(getUtility(IPersonSet).getPrecachedPersonsFromIDs(
-            person_ids, need_validity=True, need_icon=True))
+            person_ids, need_validity=True))
+
+    def getPreloadedLogsDict(self, uploads):
+        """Returns a dict of preloaded PackageUploadLog objects by
+        package_upload_id, to be used on preloading CompletePackageUpload
+        objects"""
+        logs = load_referencing(
+            PackageUploadLog, uploads, ['package_upload_id'])
+        logs.sort(key=attrgetter("date_created"), reverse=True)
+        list(getUtility(IPersonSet).getPrecachedPersonsFromIDs(
+            [log.reviewer_id for log in logs],
+            need_validity=True
+        ))
+        logs_dict = defaultdict(list)
+        for log in logs:
+            logs_dict[log.package_upload_id].append(log)
+        return logs_dict
 
     def decoratedQueueBatch(self):
         """Return the current batch, converted to decorated objects.
@@ -227,6 +245,9 @@ class QueueItemsView(LaunchpadView):
         pubs = load_referencing(
             PackageUploadBuild, uploads, ['packageuploadID'])
 
+        # preload logs info
+        logs_dict = self.getPreloadedLogsDict(uploads)
+
         source_sprs = load_related(
             SourcePackageRelease, puses, ['sourcepackagereleaseID'])
         bpbs = load_related(BinaryPackageBuild, pubs, ['buildID'])
@@ -264,7 +285,8 @@ class QueueItemsView(LaunchpadView):
 
         return [
             CompletePackageUpload(
-                item, build_upload_files, source_upload_files, package_sets)
+                item, build_upload_files, source_upload_files, package_sets,
+                logs_dict[item.id])
             for item in uploads]
 
     def is_new(self, binarypackagerelease):
@@ -493,13 +515,14 @@ class CompletePackageUpload:
     date_created = None
     sources = None
     builds = None
+    logs = None
     customfiles = None
     contains_source = None
     contains_build = None
     sourcepackagerelease = None
 
     def __init__(self, packageupload, build_upload_files,
-                 source_upload_files, package_sets):
+                 source_upload_files, package_sets, logs=None):
         self.pocket = packageupload.pocket
         self.date_created = packageupload.date_created
         self.context = packageupload
@@ -507,6 +530,7 @@ class CompletePackageUpload:
         self.contains_source = len(self.sources) > 0
         self.builds = list(packageupload.builds)
         self.contains_build = len(self.builds) > 0
+        self.logs = list(logs) if logs is not None else None
         self.customfiles = list(packageupload.customfiles)
 
         # Create a dictionary of binary files keyed by
diff --git a/lib/lp/soyuz/browser/tests/test_queue.py b/lib/lp/soyuz/browser/tests/test_queue.py
index c2d15f2..0990845 100644
--- a/lib/lp/soyuz/browser/tests/test_queue.py
+++ b/lib/lp/soyuz/browser/tests/test_queue.py
@@ -1,4 +1,4 @@
-# Copyright 2010-2018 Canonical Ltd.  This software is licensed under the
+# Copyright 2010-2020 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Unit tests for QueueItemsView."""
@@ -33,6 +33,7 @@ from lp.testing import (
     login_person,
     logout,
     person_logged_in,
+    record_two_runs,
     StormStatementRecorder,
     TestCaseWithFactory,
     )
@@ -342,10 +343,10 @@ class TestQueueItemsView(TestCaseWithFactory):
 
     layer = LaunchpadFunctionalLayer
 
-    def makeView(self, distroseries, user):
+    def makeView(self, distroseries, user, form=None):
         """Create a queue view."""
         return create_initialized_view(
-            distroseries, name='+queue', principal=user)
+            distroseries, name='+queue', principal=user, form=form)
 
     def test_view_renders_source_upload(self):
         login(ADMIN_EMAIL)
@@ -437,7 +438,41 @@ class TestQueueItemsView(TestCaseWithFactory):
             with StormStatementRecorder() as recorder:
                 view = self.makeView(distroseries, queue_admin)
                 view()
-        self.assertThat(recorder, HasQueryCount(Equals(56)))
+        self.assertThat(recorder, HasQueryCount(Equals(57)))
+
+    def test_package_upload_with_logs_query_count(self):
+        login(ADMIN_EMAIL)
+        uploads = []
+        distroseries = self.factory.makeDistroSeries()
+
+        for i in range(50):
+            uploads.append(self.factory.makeSourcePackageUpload(distroseries))
+        queue_admin = self.factory.makeArchiveAdmin(distroseries.main_archive)
+
+        def reject_some_package():
+            for upload in uploads:
+                if len(upload.logs) == 0:
+                    person = self.factory.makePerson()
+                    upload.rejectFromQueue(person)
+                    break
+
+        def run_view():
+            Store.of(uploads[0]).invalidate()
+            with person_logged_in(queue_admin):
+                form = {
+                    "queue_state": PackageUploadStatus.REJECTED.value}
+                view = self.makeView(distroseries, queue_admin, form=form)
+                view()
+
+        recorder1, recorder2 = record_two_runs(
+            run_view, reject_some_package, 1, 10)
+        # XXX: The query count should be the same, but there is one query
+        # fetching PersonLocation that only happens in the first run (no
+        # matter how many times we create anything), and it's not related at
+        # all to the logs.
+        # self.assertThat(recorder2, HasQueryCount.byEquality(recorder1))
+        self.assertThat(recorder1, HasQueryCount(Equals(37)))
+        self.assertThat(recorder2, HasQueryCount(Equals(36)))
 
 
 class TestCompletePackageUpload(TestCaseWithFactory):
diff --git a/lib/lp/soyuz/configure.zcml b/lib/lp/soyuz/configure.zcml
index 0e35ccb..f159d27 100644
--- a/lib/lp/soyuz/configure.zcml
+++ b/lib/lp/soyuz/configure.zcml
@@ -1,4 +1,4 @@
-<!-- Copyright 2009-2019 Canonical Ltd.  This software is licensed under the
+<!-- Copyright 2009-2020 Canonical Ltd.  This software is licensed under the
      GNU Affero General Public License version 3 (see the file LICENSE).
 -->
 
@@ -151,6 +151,7 @@
                 displayarchs
                 displayversion
                 isPPA
+                logs
                 package_copy_job
                 package_copy_job_id
                 package_name
@@ -183,6 +184,11 @@
             set_attributes="status distroseries pocket changesfile archive"/>
     </class>
     <class
+        class="lp.soyuz.model.queue.PackageUploadLog">
+        <allow
+            interface="lp.soyuz.interfaces.queue.IPackageUploadLog"/>
+    </class>
+    <class
         class="lp.soyuz.model.queue.PackageUploadSource">
         <allow
             interface="lp.soyuz.interfaces.queue.IPackageUploadSource"/>
diff --git a/lib/lp/soyuz/interfaces/queue.py b/lib/lp/soyuz/interfaces/queue.py
index c57da73..6f440b3 100644
--- a/lib/lp/soyuz/interfaces/queue.py
+++ b/lib/lp/soyuz/interfaces/queue.py
@@ -1,4 +1,4 @@
-# Copyright 2009-2016 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2020 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 """Queue interfaces."""
@@ -11,6 +11,7 @@ __all__ = [
     'IHasQueueItems',
     'IPackageUploadQueue',
     'IPackageUpload',
+    'IPackageUploadLog',
     'IPackageUploadBuild',
     'IPackageUploadSource',
     'IPackageUploadCustom',
@@ -37,6 +38,7 @@ from lazr.restful.declarations import (
     REQUEST_USER,
     )
 from lazr.restful.fields import Reference
+from lp.registry.interfaces.person import IPerson
 from zope.interface import (
     Attribute,
     Interface,
@@ -149,6 +151,8 @@ class IPackageUpload(Interface):
             title=_('Date created'),
             description=_("The date this package upload was done.")))
 
+    logs = Attribute(_("The change log of this PackageUpload."))
+
     changesfile = Attribute("The librarian alias for the changes file "
                             "associated with this upload")
     changes_file_url = exported(
@@ -711,6 +715,33 @@ class IPackageUploadCustom(Interface):
         """
 
 
+class IPackageUploadLog(Interface):
+    id = Int(title=_('ID'), required=True, readonly=True)
+
+    package_upload = Reference(
+        IPackageUpload,
+        title=_("Original package upload."), required=True, readonly=True)
+
+    date_created = Datetime(
+        title=_("When this action happened."), required=True, readonly=True)
+
+    reviewer = Reference(
+        IPerson, title=_("Who did this action."),
+        required=True, readonly=True)
+
+    old_status = Choice(
+        vocabulary=PackageUploadStatus, description=_("Old status."),
+        required=True, readonly=True)
+
+    new_status = Choice(
+        vocabulary=PackageUploadStatus, description=_("New status."),
+        required=True, readonly=True)
+
+    comment = TextLine(
+        title=_("User's comment about this change."),
+        required=False, readonly=True)
+
+
 class IPackageUploadSet(Interface):
     """Represents a set of IPackageUploads"""
 
diff --git a/lib/lp/soyuz/model/queue.py b/lib/lp/soyuz/model/queue.py
index 0fd1799..d394e9e 100644
--- a/lib/lp/soyuz/model/queue.py
+++ b/lib/lp/soyuz/model/queue.py
@@ -1,18 +1,21 @@
-# Copyright 2009-2018 Canonical Ltd.  This software is licensed under the
+# Copyright 2009-2020 Canonical Ltd.  This software is licensed under the
 # GNU Affero General Public License version 3 (see the file LICENSE).
 
 __metaclass__ = type
 __all__ = [
-    'PackageUploadQueue',
     'PackageUpload',
     'PackageUploadBuild',
-    'PackageUploadSource',
     'PackageUploadCustom',
+    'PackageUploadLog',
+    'PackageUploadQueue',
     'PackageUploadSet',
+    'PackageUploadSource',
     ]
 
 from itertools import chain
 
+import pytz
 from sqlobject import (
     ForeignKey,
     SQLMultipleJoin,
@@ -29,6 +32,7 @@ from storm.locals import (
     SQL,
     Unicode,
     )
+from storm.properties import DateTime
 from storm.store import (
     EmptyResultSet,
     Store,
@@ -49,7 +53,10 @@ from lp.services.database.bulk import (
 from lp.services.database.constants import UTC_NOW
 from lp.services.database.datetimecol import UtcDateTimeCol
 from lp.services.database.decoratedresultset import DecoratedResultSet
-from lp.services.database.enumcol import EnumCol
+from lp.services.database.enumcol import (
+    DBEnum,
+    EnumCol,
+    )
 from lp.services.database.interfaces import (
     IMasterStore,
     IStore,
@@ -58,6 +65,7 @@ from lp.services.database.sqlbase import (
     SQLBase,
     sqlvalues,
     )
+from lp.services.database.stormbase import StormBase
 from lp.services.database.stormexpr import (
     Array,
     ArrayContains,
@@ -97,6 +105,7 @@ from lp.soyuz.interfaces.queue import (
     IPackageUpload,
     IPackageUploadBuild,
     IPackageUploadCustom,
+    IPackageUploadLog,
     IPackageUploadQueue,
     IPackageUploadSet,
     IPackageUploadSource,
 from lp.soyuz.interfaces.section import ISectionSet
 from lp.soyuz.mail.packageupload import PackageUploadMailer
 from lp.soyuz.model.binarypackagename import BinaryPackageName
@@ -203,6 +212,26 @@ class PackageUpload(SQLBase):
         if self.package_copy_job:
             self.addSearchableNames([self.package_copy_job.package_name])
             self.addSearchableVersions([self.package_copy_job.package_version])
+        self._logs = None
+
+    @property
+    def logs(self):
+        if self._logs is None:
+            logs = Store.of(self).find(
+                PackageUploadLog,
+                PackageUploadLog.package_upload == self)
+            self._logs = list(logs.order_by(Desc('date_created')))
+        return self._logs
+
+    def _addLog(self, reviewer, new_status, comment=None):
+        self._logs = None  # clear local cache
+        return Store.of(self).add(PackageUploadLog(
+            package_upload=self,
+            reviewer=reviewer,
+            old_status=self.status,
+            new_status=new_status,
+            comment=comment
+        ))
 
     @cachedproperty
     def sources(self):
@@ -571,6 +600,10 @@ class PackageUpload(SQLBase):
 
     def acceptFromQueue(self, user=None):
         """See `IPackageUpload`."""
+        # XXX: Only the tests call this without a user. We should adjust
+        # the tests and then always create the log entry here.
+        if user is not None:
+            self._addLog(user, PackageUploadStatus.ACCEPTED, None)
         if self.package_copy_job is None:
             self._acceptNonSyncFromQueue()
         else:
@@ -581,6 +614,7 @@ class PackageUpload(SQLBase):
 
     def rejectFromQueue(self, user, comment=None):
         """See `IPackageUpload`."""
+        self._addLog(user, PackageUploadStatus.REJECTED, comment)
         self.setRejected()
         if self.package_copy_job is not None:
             # Circular imports :(
@@ -1153,6 +1187,46 @@ def get_properties_for_binary(bpr):
         }
 
 
+@implementer(IPackageUploadLog)
+class PackageUploadLog(StormBase):
+    """Tracking of status changes for a given package upload"""
+
+    __storm_table__ = "PackageUploadLog"
+
+    id = Int(primary=True)
+
+    package_upload_id = Int(name='package_upload')
+    package_upload = Reference(package_upload_id, PackageUpload.id)
+
+    date_created = DateTime(tzinfo=pytz.UTC, allow_none=False,
+                            default=UTC_NOW)
+
+    reviewer_id = Int(name='reviewer', allow_none=False)
+    reviewer = Reference(reviewer_id, 'Person.id')
+
+    old_status = DBEnum(enum=PackageUploadStatus, allow_none=False)
+
+    new_status = DBEnum(enum=PackageUploadStatus, allow_none=False)
+
+    comment = Unicode(allow_none=True)
+
+    def __init__(self, package_upload, reviewer, old_status, new_status,
+                 comment=None, date_created=None):
+        self.package_upload = package_upload
+        self.reviewer = reviewer
+        self.old_status = old_status
+        self.new_status = new_status
+        if comment is not None:
+            self.comment = comment
+        if date_created is not None:
+            self.date_created = date_created
+
+    def __repr__(self):
+        return (
+            "<{self.__class__.__name__} ~{self.reviewer.name} "
+            "changed {self.package_upload} to {self.new_status} "
+            "{self.date_created}>").format(self=self)
+
+
 @implementer(IPackageUploadBuild)
 class PackageUploadBuild(SQLBase):
     """A Queue item's related builds."""
diff --git a/lib/lp/soyuz/stories/soyuz/xx-queue-pages.txt b/lib/lp/soyuz/stories/soyuz/xx-queue-pages.txt
index 7417a7b..8dcd71a 100644
--- a/lib/lp/soyuz/stories/soyuz/xx-queue-pages.txt
+++ b/lib/lp/soyuz/stories/soyuz/xx-queue-pages.txt
@@ -510,7 +510,7 @@ values:
     >>> filelist = find_tags_by_class(anon_browser.contents, 'queue-2')
     >>> for row in filelist:
     ...     print(extract_text(row))
-    pmount_1.0-1_all.deb (18 bytes) NEW 0.1-1 restricted admin extra
+    pmount_1.0-1_all.deb (18 bytes) NEW 0.1-1 restricted admin extra Accepted a moment ago by Sample Person
 
 'netapplet' has gone straight to the 'done' queue because it's a single
 source upload, and we can see its overridden values there:
@@ -546,6 +546,15 @@ Rejecting 'alsa-utils' source:
     netapplet...ddtp... -                                           Release 2006-...
     netapplet...dist... -                                           Release 2006-...
 
+    >>> upload_manager_browser.getControl(
+    ...     name="queue_state", index=0).displayValue=['Rejected']
+    >>> upload_manager_browser.getControl("Update").click()
+    >>> logs = find_tags_by_class(
+    ...     upload_manager_browser.contents, "log-content")
+    >>> for log in logs:
+    ...     print(extract_text(log))
+    Rejected...a moment ago...by Sample Person...Foo
+
 One rejection email is generated:
 
     >>> run_package_upload_notification_jobs()
@@ -599,6 +608,21 @@ button will be disabled.
     >>> upload_manager_browser.getControl(name="Reject").disabled
     True
 
+Accepting 'alsa-utils' again adds another row to the package upload log:
+
+    >>> upload_manager_browser.getControl(name="QUEUE_ID").value = ['4']
+    >>> upload_manager_browser.getControl(name="Accept").click()
+    >>> upload_manager_browser.getControl(
+    ...     name="queue_state", index=0).displayValue=['Accepted']
+    >>> upload_manager_browser.getControl("Update").click()
+    >>> pkg_content = first_tag_by_class(upload_manager_browser.contents,
+    ...                                  "queue-4")
+    >>> logs = find_tags_by_class(str(pkg_content), "log-content")
+    >>> for log in logs:
+    ...     print(extract_text(log))
+    Accepted...a moment ago...by Sample Person...
+    Rejected...a moment ago...by Sample Person...Foo
+
 
 Clean up
 ========
diff --git a/lib/lp/soyuz/templates/distroseries-queue.pt b/lib/lp/soyuz/templates/distroseries-queue.pt
index 59b77d5..39629ce 100644
--- a/lib/lp/soyuz/templates/distroseries-queue.pt
+++ b/lib/lp/soyuz/templates/distroseries-queue.pt
@@ -149,6 +149,16 @@
            </tbody>
            <tbody tal:attributes="class string:${filelist_class}">
              <metal:filelist use-macro="template/macros/package-filelist"/>
+             <tr class="log-content" tal:repeat="log packageupload/logs">
+               <td colspan="2" style="border: 0"></td>
+               <td colspan="8" style="border: 0">
+                 <span tal:content="log/new_status"></span>
+                 <span tal:attributes="title log/date_created/fmt:datetime"
+                       tal:content="log/date_created/fmt:displaydate" />
+                 by <span tal:content="structure log/reviewer/fmt:link" />
+                 <p tal:condition="log/comment" tal:content="log/comment" />
+               </td>
+             </tr>
            </tbody>
          </tal:block>
        </tal:batch>
diff --git a/lib/lp/soyuz/tests/test_packageupload.py b/lib/lp/soyuz/tests/test_packageupload.py
index cb4d038..a2eb80f 100644
--- a/lib/lp/soyuz/tests/test_packageupload.py
+++ b/lib/lp/soyuz/tests/test_packageupload.py
@@ -16,7 +16,10 @@ from lazr.restfulclient.errors import (
     BadRequest,
     Unauthorized,
     )
-from testtools.matchers import Equals
+from testtools.matchers import (
+    Equals,
+    MatchesStructure,
+    )
 import transaction
 from zope.component import (
     getUtility,
@@ -91,6 +94,27 @@ class PackageUploadTestCase(TestCaseWithFactory):
         if os.path.exists(config.personalpackagearchive.root):
             shutil.rmtree(config.personalpackagearchive.root)
 
+    def test_add_log_entry(self):
+        upload = self.factory.makePackageUpload(
+            status=PackageUploadStatus.UNAPPROVED)
+        upload = removeSecurityProxy(upload)
+        self.assertEqual(0, len(upload.logs))
+
+        person = self.factory.makePerson(name='lpusername')
+
+        upload._addLog(person, PackageUploadStatus.REJECTED, 'just because')
+
+        log = upload.logs[0]
+        self.assertThat(log, MatchesStructure.byEquality(
+            reviewer=person, old_status=PackageUploadStatus.UNAPPROVED,
+            new_status=PackageUploadStatus.REJECTED, comment='just because'))
+
+        expected_repr = (
+            "<PackageUploadLog ~lpusername "
+            "changed {self.package_upload} to Rejected "
+            "{self.date_created}>").format(self=log)
+        self.assertEqual(str(expected_repr), repr(log))
+
     def test_realiseUpload_for_overridden_component_archive(self):
         # If the component of an upload is overridden to 'Partner' for
         # example, then the new publishing record should be for the
@@ -358,8 +382,14 @@ class PackageUploadTestCase(TestCaseWithFactory):
 
         # Accepting one of them works.  (Since it's a single source upload,
         # it goes straight to DONE.)
-        upload_one.acceptFromQueue()
+        person = self.factory.makePerson()
+        upload_one.acceptFromQueue(person)
         self.assertEqual("DONE", upload_one.status.name)
+
+        log = upload_one.logs[0]
+        self.assertThat(log, MatchesStructure.byEquality(
+            reviewer=person, old_status=PackageUploadStatus.UNAPPROVED,
+            new_status=PackageUploadStatus.ACCEPTED, comment=None))
         transaction.commit()
 
         # Trying to accept the second fails.
@@ -368,9 +398,15 @@ class PackageUploadTestCase(TestCaseWithFactory):
         self.assertEqual("UNAPPROVED", upload_two.status.name)
 
         # Rejecting the second upload works.
-        upload_two.rejectFromQueue(self.factory.makePerson())
+        upload_two.rejectFromQueue(person, 'Because yes')
         self.assertEqual("REJECTED", upload_two.status.name)
 
+        self.assertEqual(1, len(upload_two.logs))
+        log = upload_two.logs[0]
+        self.assertThat(log, MatchesStructure.byEquality(
+            reviewer=person, old_status=PackageUploadStatus.UNAPPROVED,
+            new_status=PackageUploadStatus.REJECTED, comment='Because yes'))
+
     def test_rejectFromQueue_source_sends_email(self):
         # Rejecting a source package sends an email to the uploader.
         self.test_publisher.prepareBreezyAutotest()
