
launchpad-reviewers team mailing list archive

[Merge] lp:~wallyworld/launchpad/ppa-packages-timeout-1071581 into lp:launchpad


Ian Booth has proposed merging lp:~wallyworld/launchpad/ppa-packages-timeout-1071581 into lp:launchpad.

Commit message:
Introduce a new reporting table (populated by a frequent garbo job) to allow efficient loading of source package releases related to a person.

Requested reviews:
  Curtis Hovey (sinzui): code
Related bugs:
  Bug #1071581 in Launchpad itself: "+ppa-packages timeout"
  https://bugs.launchpad.net/launchpad/+bug/1071581

For more details, see:
https://code.launchpad.net/~wallyworld/launchpad/ppa-packages-timeout-1071581/+merge/132646

== Pre Implementation ==

Discussed with wgrant.

== Implementation ==

Postgres cannot efficiently execute the DISTINCT ON query required to find only the latest published packages.
After a previous attempt to solve this, the best approach is to introduce a new denormalised reporting table and populate it with a garbo job. The data does not have to be completely up-to-date, so a frequently run garbo job will suffice.
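The "latest release per (distroseries, package name, archive)" selection that DISTINCT ON expresses can be sketched in plain Python. This is an illustration only, with hypothetical row dicts rather than real Launchpad schema:

```python
def latest_releases(rows):
    """Keep only the most recent upload per (distroseries, name, archive).

    A plain-Python sketch of the DISTINCT ON ... ORDER BY dateuploaded DESC
    semantics; the field names here are hypothetical.
    """
    latest = {}
    for row in rows:
        key = (row["distroseries"], row["name"], row["archive"])
        # Retain the row with the newest upload date for each key.
        if key not in latest or row["uploaded"] > latest[key]["uploaded"]:
            latest[key] = row
    # Present results newest-first, as the person views do.
    return sorted(latest.values(), key=lambda r: r["uploaded"], reverse=True)
```

The denormalised table effectively materialises the output of this selection so the views never pay for the grouping at request time.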

This MP contains a few parts:

1. New model class LatestPersonSourcepackageReleaseCache (rows from the denormalised reporting table)

2. Tweaks to the view code and person model to get data from the new denormalised table. A huge chunk of slow SQL is replaced by a simple SELECT.

3. Garbo job to populate the new reporting table

4. New sample data for the existing model data

The denormalised table contains data rows keyed on both package creators and maintainers. This allows the views to filter on either of these as required.
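A rough plain-Python approximation of that role-based filtering, using hypothetical row dicts and purpose values rather than the real model classes:

```python
def filter_cache_rows(rows, person, role, ppa_only=False):
    """Select cache rows where `person` acts as 'creator' or 'maintainer'.

    Sketch of the view-level filtering over the denormalised rows; the
    dict keys and 'purpose' strings are illustrative, not the real schema.
    """
    matched = []
    for row in rows:
        # Each cache row is keyed on exactly one of creator/maintainer.
        if row.get(role) != person:
            continue
        # PPA queries want only PPA rows; non-PPA queries exclude them.
        is_ppa = row.get("purpose") == "PPA"
        if ppa_only != is_ppa:
            continue
        matched.append(row)
    return matched
```
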

The garbo job maintains state between runs in a new generic GarboJobState table. This replaces the use of memcache, which is now deprecated. The job processes the creator rows first and then the maintainer rows.
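The save/load pattern for that state table can be sketched as follows. This uses sqlite3 and the stdlib json module purely for illustration; the real code goes through the Launchpad store and simplejson:

```python
import json
import sqlite3


def save_job_state(conn, job_name, state):
    """Upsert a job's JSON state: UPDATE first, INSERT if no row existed."""
    data = json.dumps(state)
    cur = conn.execute(
        "UPDATE garbojobstate SET json_data = ? WHERE name = ?",
        (data, job_name))
    if cur.rowcount == 0:
        conn.execute(
            "INSERT INTO garbojobstate (name, json_data) VALUES (?, ?)",
            (job_name, data))


def load_job_state(conn, job_name):
    """Return the decoded state for the job, or None if never saved."""
    row = conn.execute(
        "SELECT json_data FROM garbojobstate WHERE name = ?",
        (job_name,)).fetchone()
    return json.loads(row[0]) if row else None
```

The update-then-insert shape avoids needing a separate existence check and keeps the table at one row per job name.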

This MP requires the corresponding db-devel branch to land first so that the new tables exist.

== Tests ==

Add a new test for the new garbo job.

== Lint ==

Checking for conflicts and issues in changed files.

Linting changed files:
  database/schema/security.cfg
  lib/lp/registry/browser/person.py
  lib/lp/registry/model/person.py
  lib/lp/scripts/garbo.py
  lib/lp/soyuz/configure.zcml
  lib/lp/soyuz/interfaces/reporting.py
  lib/lp/soyuz/model/reporting.py
-- 
https://code.launchpad.net/~wallyworld/launchpad/ppa-packages-timeout-1071581/+merge/132646
Your team Launchpad code reviewers is subscribed to branch lp:launchpad.
=== modified file 'database/sampledata/current-dev.sql'
--- database/sampledata/current-dev.sql	2012-10-24 23:57:38 +0000
+++ database/sampledata/current-dev.sql	2012-11-05 11:37:22 +0000
@@ -4348,6 +4348,13 @@
 
 
 
+ALTER TABLE garbojobstate DISABLE TRIGGER ALL;
+
+
+
+ALTER TABLE garbojobstate ENABLE TRIGGER ALL;
+
+
 ALTER TABLE hwvendorname DISABLE TRIGGER ALL;
 
 INSERT INTO hwvendorname (id, name) VALUES (1, 'MSI');
@@ -4829,6 +4836,46 @@
 ALTER TABLE languagepack ENABLE TRIGGER ALL;
 
 
+ALTER TABLE latestpersonsourcepackagereleasecache DISABLE TRIGGER ALL;
+
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (116, 1, '2004-09-27 11:57:13', 1, NULL, 1, 1, 1, 1, 14);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (122, 5, '2005-03-10 16:30:00', 1, NULL, 1, 1, 3, 10, 16);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (123, 4, '2005-03-14 18:00:00', 1, NULL, 1, 1, 1, 10, 17);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (124, 9, '2005-06-24 20:59:31.439579', 1, NULL, 1, 1, 3, 14, 21);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (125, 10, '2005-02-03 08:50:00', 1, NULL, 1, 1, 1, 19, 23);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (126, 15, '2005-07-01 22:47:00', 1, NULL, 1, 1, 8, 19, 24);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (127, 12, '2005-08-01 14:10:00', 1, NULL, 1, 1, 3, 19, 25);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (128, 13, '2005-12-22 18:19:00', 1, NULL, 1, 1, 3, 20, 26);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (129, 16, '2006-02-10 11:19:00', 1, NULL, 1, 1, 3, 21, 27);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (130, 19, '2005-12-22 18:19:00', 1, NULL, 1, 1, 3, 22, 28);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (131, 22, '2006-12-01 13:19:01', 1, NULL, 1, 1, 10, 23, 32);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (132, 23, '2006-12-01 13:19:01', 1, NULL, 1, 1, 10, 24, 33);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (133, 31, '2006-04-11 11:19:01', 1, NULL, 2, 10, 1, 26, 35);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (134, 30, '2007-08-09 21:25:37.832976', 243606, NULL, 4, 12, 10, 27, 36);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (135, 33, '2005-08-01 14:10:00', 1, NULL, 1, 1, 3, 9, 39);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (136, 35, '2006-07-05 14:10:00', 1, NULL, 1, 1, 3, 1, 41);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (137, 1, '2004-09-27 11:57:13', NULL, 1, 1, 1, 1, 1, 14);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (138, 2, '2004-09-27 11:57:13', NULL, 1, 1, 1, 3, 9, 15);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (139, 5, '2005-03-10 16:30:00', NULL, 1, 1, 1, 3, 10, 16);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (140, 4, '2005-03-14 18:00:00', NULL, 1, 1, 1, 1, 10, 17);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (141, 9, '2005-06-24 20:59:31.439579', NULL, 1, 1, 1, 3, 14, 21);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (142, 10, '2005-02-03 08:50:00', NULL, 1, 1, 1, 1, 19, 23);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (143, 14, '2005-07-01 22:47:00', NULL, 1, 1, 1, 8, 19, 24);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (144, 20, '2005-08-01 14:10:00', NULL, 16, 1, 1, 3, 19, 25);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (145, 13, '2005-12-22 18:19:00', NULL, 16, 1, 1, 3, 20, 26);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (146, 16, '2006-02-10 11:19:00', NULL, 16, 1, 1, 3, 21, 27);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (147, 17, '2005-12-22 18:19:00', NULL, 16, 1, 1, 3, 22, 28);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (148, 22, '2006-12-01 13:19:01', NULL, 16, 1, 1, 10, 23, 32);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (149, 25, '2006-12-01 13:19:01', NULL, 16, 1, 1, 10, 24, 33);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (150, 31, '2006-04-11 11:19:01', NULL, 16, 2, 10, 1, 26, 35);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (151, 30, '2007-08-09 21:25:37.832976', NULL, 243606, 4, 12, 10, 27, 36);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (152, 33, '2005-08-01 14:10:00', NULL, 16, 1, 1, 3, 9, 39);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (153, 35, '2006-07-05 14:10:00', NULL, 16, 1, 1, 3, 1, 41);
+
+
+ALTER TABLE latestpersonsourcepackagereleasecache ENABLE TRIGGER ALL;
+
+
 ALTER TABLE launchpaddatabaserevision DISABLE TRIGGER ALL;
 
 

=== modified file 'database/sampledata/current.sql'
--- database/sampledata/current.sql	2012-10-24 23:57:38 +0000
+++ database/sampledata/current.sql	2012-11-05 11:37:22 +0000
@@ -4275,6 +4275,13 @@
 
 
 
+ALTER TABLE garbojobstate DISABLE TRIGGER ALL;
+
+
+
+ALTER TABLE garbojobstate ENABLE TRIGGER ALL;
+
+
 ALTER TABLE hwvendorname DISABLE TRIGGER ALL;
 
 INSERT INTO hwvendorname (id, name) VALUES (1, 'MSI');
@@ -4756,6 +4763,46 @@
 ALTER TABLE languagepack ENABLE TRIGGER ALL;
 
 
+ALTER TABLE latestpersonsourcepackagereleasecache DISABLE TRIGGER ALL;
+
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (116, 1, '2004-09-27 11:57:13', 1, NULL, 1, 1, 1, 1, 14);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (122, 5, '2005-03-10 16:30:00', 1, NULL, 1, 1, 3, 10, 16);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (123, 4, '2005-03-14 18:00:00', 1, NULL, 1, 1, 1, 10, 17);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (124, 9, '2005-06-24 20:59:31.439579', 1, NULL, 1, 1, 3, 14, 21);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (125, 10, '2005-02-03 08:50:00', 1, NULL, 1, 1, 1, 19, 23);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (126, 15, '2005-07-01 22:47:00', 1, NULL, 1, 1, 8, 19, 24);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (127, 12, '2005-08-01 14:10:00', 1, NULL, 1, 1, 3, 19, 25);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (128, 13, '2005-12-22 18:19:00', 1, NULL, 1, 1, 3, 20, 26);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (129, 16, '2006-02-10 11:19:00', 1, NULL, 1, 1, 3, 21, 27);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (130, 19, '2005-12-22 18:19:00', 1, NULL, 1, 1, 3, 22, 28);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (131, 22, '2006-12-01 13:19:01', 1, NULL, 1, 1, 10, 23, 32);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (132, 23, '2006-12-01 13:19:01', 1, NULL, 1, 1, 10, 24, 33);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (133, 31, '2006-04-11 11:19:01', 1, NULL, 2, 10, 1, 26, 35);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (134, 30, '2007-08-09 21:25:37.832976', 243606, NULL, 4, 12, 10, 27, 36);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (135, 33, '2005-08-01 14:10:00', 1, NULL, 1, 1, 3, 9, 39);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (136, 35, '2006-07-05 14:10:00', 1, NULL, 1, 1, 3, 1, 41);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (137, 1, '2004-09-27 11:57:13', NULL, 1, 1, 1, 1, 1, 14);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (138, 2, '2004-09-27 11:57:13', NULL, 1, 1, 1, 3, 9, 15);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (139, 5, '2005-03-10 16:30:00', NULL, 1, 1, 1, 3, 10, 16);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (140, 4, '2005-03-14 18:00:00', NULL, 1, 1, 1, 1, 10, 17);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (141, 9, '2005-06-24 20:59:31.439579', NULL, 1, 1, 1, 3, 14, 21);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (142, 10, '2005-02-03 08:50:00', NULL, 1, 1, 1, 1, 19, 23);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (143, 14, '2005-07-01 22:47:00', NULL, 1, 1, 1, 8, 19, 24);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (144, 20, '2005-08-01 14:10:00', NULL, 16, 1, 1, 3, 19, 25);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (145, 13, '2005-12-22 18:19:00', NULL, 16, 1, 1, 3, 20, 26);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (146, 16, '2006-02-10 11:19:00', NULL, 16, 1, 1, 3, 21, 27);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (147, 17, '2005-12-22 18:19:00', NULL, 16, 1, 1, 3, 22, 28);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (148, 22, '2006-12-01 13:19:01', NULL, 16, 1, 1, 10, 23, 32);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (149, 25, '2006-12-01 13:19:01', NULL, 16, 1, 1, 10, 24, 33);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (150, 31, '2006-04-11 11:19:01', NULL, 16, 2, 10, 1, 26, 35);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (151, 30, '2007-08-09 21:25:37.832976', NULL, 243606, 4, 12, 10, 27, 36);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (152, 33, '2005-08-01 14:10:00', NULL, 16, 1, 1, 3, 9, 39);
+INSERT INTO latestpersonsourcepackagereleasecache (id, publication, date_uploaded, creator, maintainer, archive_purpose, upload_archive, upload_distroseries, sourcepackagename, sourcepackagerelease) VALUES (153, 35, '2006-07-05 14:10:00', NULL, 16, 1, 1, 3, 1, 41);
+
+
+ALTER TABLE latestpersonsourcepackagereleasecache ENABLE TRIGGER ALL;
+
+
 ALTER TABLE launchpaddatabaserevision DISABLE TRIGGER ALL;
 
 

=== modified file 'database/schema/security.cfg'
--- database/schema/security.cfg	2012-11-01 19:26:37 +0000
+++ database/schema/security.cfg	2012-11-05 11:37:22 +0000
@@ -217,6 +217,7 @@
 public.karmatotalcache                  = SELECT, DELETE, UPDATE
 public.language                         = SELECT
 public.languagepack                     = SELECT, INSERT, UPDATE
+public.latestpersonsourcepackagereleasecache = SELECT
 public.launchpadstatistic               = SELECT
 public.libraryfilealias                 = SELECT, INSERT, UPDATE, DELETE
 public.libraryfiledownloadcount         = SELECT, INSERT, UPDATE
@@ -2239,8 +2240,10 @@
 public.codeimportresult                 = SELECT, DELETE
 public.commercialsubscription           = SELECT, UPDATE
 public.emailaddress                     = SELECT, UPDATE, DELETE
+public.garbojobstate                    = SELECT, INSERT, UPDATE, DELETE
 public.hwsubmission                     = SELECT, UPDATE
 public.job                              = SELECT, INSERT, DELETE
+public.latestpersonsourcepackagereleasecache = SELECT, INSERT, UPDATE
 public.logintoken                       = SELECT, DELETE
 public.mailinglistsubscription          = SELECT, DELETE
 public.milestonetag                     = SELECT

=== modified file 'lib/lp/registry/browser/person.py'
--- lib/lp/registry/browser/person.py	2012-11-01 03:41:36 +0000
+++ lib/lp/registry/browser/person.py	2012-11-05 11:37:22 +0000
@@ -3644,17 +3644,19 @@
         builds_by_package = {}
         needs_build_by_package = {}
         for package in package_releases:
-            builds_by_package[package] = []
-            needs_build_by_package[package] = False
+            builds_by_package[package.id] = []
+            needs_build_by_package[package.id] = False
         for build in all_builds:
             if build.status == BuildStatus.FAILEDTOBUILD:
-                builds_by_package[build.source_package_release].append(build)
+                builds_by_package[
+                    build.source_package_release.id].append(build)
             needs_build = build.status in [
                 BuildStatus.NEEDSBUILD,
                 BuildStatus.MANUALDEPWAIT,
                 BuildStatus.CHROOTWAIT,
                 ]
-            needs_build_by_package[build.source_package_release] = needs_build
+            needs_build_by_package[
+                build.source_package_release.id] = needs_build
 
         return (builds_by_package, needs_build_by_package)
 
@@ -3665,8 +3667,8 @@
 
         return [
             SourcePackageReleaseWithStats(
-                package, builds_by_package[package],
-                needs_build_by_package[package])
+                package, builds_by_package[package.id],
+                needs_build_by_package[package.id])
             for package in package_releases]
 
     def _addStatsToPublishings(self, publishings):
@@ -3679,8 +3681,8 @@
 
         return [
             SourcePackagePublishingHistoryWithStats(
-                spph, builds_by_package[spph.sourcepackagerelease],
-                needs_build_by_package[spph.sourcepackagerelease])
+                spph, builds_by_package[spph.sourcepackagerelease.id],
+                needs_build_by_package[spph.sourcepackagerelease.id])
             for spph in filtered_spphs]
 
     def setUpBatch(self, packages):

=== modified file 'lib/lp/registry/model/person.py'
--- lib/lp/registry/model/person.py	2012-11-03 18:05:44 +0000
+++ lib/lp/registry/model/person.py	2012-11-05 11:37:22 +0000
@@ -327,6 +327,7 @@
     Archive,
     validate_ppa,
     )
+from lp.soyuz.model.reporting import LatestPersonSourcepackageReleaseCache
 from lp.soyuz.model.publishing import SourcePackagePublishingHistory
 from lp.soyuz.model.sourcepackagerelease import SourcePackageRelease
 from lp.translations.model.hastranslationimports import (
@@ -2810,8 +2811,8 @@
         return self._latestReleasesQuery(uploader_only=True, ppa_only=True)
 
     def _releasesQueryFilter(self, uploader_only=False, ppa_only=False):
-        """Return the filter used to find sourcepackagereleases (SPRs)
-        related to this person.
+        """Return the filter used to find latest published source package
+        releases (SPRs) related to this person.
 
         :param uploader_only: controls if we are interested in SPRs where
             the person in question is only the uploader (creator) and not the
@@ -2827,24 +2828,27 @@
         'uploader_only' because there shouldn't be any sense of maintainership
         for packages uploaded to PPAs by someone else than the user himself.
         """
-        clauses = [SourcePackageRelease.upload_archive == Archive.id]
-
+        clauses = []
         if uploader_only:
-            clauses.append(SourcePackageRelease.creator == self)
-
+            clauses.append(
+                LatestPersonSourcepackageReleaseCache.creator == self)
         if ppa_only:
             # Source maintainer is irrelevant for PPA uploads.
             pass
         elif uploader_only:
-            clauses.append(SourcePackageRelease.maintainer != self)
+            clauses.append(
+                LatestPersonSourcepackageReleaseCache.maintainer != self)
         else:
-            clauses.append(SourcePackageRelease.maintainer == self)
-
+            clauses.append(
+                LatestPersonSourcepackageReleaseCache.maintainer == self)
         if ppa_only:
-            clauses.append(Archive.purpose == ArchivePurpose.PPA)
+            clauses.append(
+                LatestPersonSourcepackageReleaseCache.archive_purpose ==
+                ArchivePurpose.PPA)
         else:
-            clauses.append(Archive.purpose != ArchivePurpose.PPA)
-
+            clauses.append(
+                LatestPersonSourcepackageReleaseCache.archive_purpose !=
+                ArchivePurpose.PPA)
         return clauses
 
     def _hasReleasesQuery(self, uploader_only=False, ppa_only=False):
@@ -2852,50 +2856,26 @@
         See `_releasesQueryFilter` for details on the criteria used.
         """
         clauses = self._releasesQueryFilter(uploader_only, ppa_only)
-        spph = ClassAlias(SourcePackagePublishingHistory, "spph")
-        tables = (
-            SourcePackageRelease,
-            Join(
-                spph, spph.sourcepackagereleaseID == SourcePackageRelease.id),
-            Join(Archive, Archive.id == spph.archiveID))
-        rs = Store.of(self).using(*tables).find(
-            SourcePackageRelease.id, clauses)
+        rs = Store.of(self).using(LatestPersonSourcepackageReleaseCache).find(
+            LatestPersonSourcepackageReleaseCache.publication_id, clauses)
         return not rs.is_empty()
 
     def _latestReleasesQuery(self, uploader_only=False, ppa_only=False):
-        """Return the sourcepackagereleases (SPRs) related to this person.
+        """Return the sourcepackagereleases records related to this person.
         See `_releasesQueryFilter` for details on the criteria used."""
         clauses = self._releasesQueryFilter(uploader_only, ppa_only)
-        spph = ClassAlias(SourcePackagePublishingHistory, "spph")
         rs = Store.of(self).find(
-            SourcePackageRelease,
-            SourcePackageRelease.id.is_in(
-                Select(
-                    SourcePackageRelease.id,
-                    tables=[
-                        SourcePackageRelease,
-                        Join(
-                            spph,
-                            spph.sourcepackagereleaseID ==
-                            SourcePackageRelease.id),
-                        Join(Archive, Archive.id == spph.archiveID)],
-                    where=And(*clauses),
-                    order_by=[SourcePackageRelease.upload_distroseriesID,
-                              SourcePackageRelease.sourcepackagenameID,
-                              SourcePackageRelease.upload_archiveID,
-                              Desc(SourcePackageRelease.dateuploaded)],
-                    distinct=(
-                        SourcePackageRelease.upload_distroseriesID,
-                        SourcePackageRelease.sourcepackagenameID,
-                        SourcePackageRelease.upload_archiveID)))
-        ).order_by(
-            Desc(SourcePackageRelease.dateuploaded), SourcePackageRelease.id)
+            LatestPersonSourcepackageReleaseCache, *clauses).order_by(
+            Desc(LatestPersonSourcepackageReleaseCache.dateuploaded))
 
         def load_related_objects(rows):
             list(getUtility(IPersonSet).getPrecachedPersonsFromIDs(
-                set(map(attrgetter("maintainerID"), rows))))
-            bulk.load_related(SourcePackageName, rows, ['sourcepackagenameID'])
-            bulk.load_related(Archive, rows, ['upload_archiveID'])
+                set(map(attrgetter("maintainer_id"), rows))))
+            bulk.load_related(
+                SourcePackageName, rows, ['sourcepackagename_id'])
+            bulk.load_related(
+                SourcePackageRelease, rows, ['sourcepackagerelease_id'])
+            bulk.load_related(Archive, rows, ['upload_archive_id'])
 
         return DecoratedResultSet(rs, pre_iter_hook=load_related_objects)
 

=== modified file 'lib/lp/scripts/garbo.py'
--- lib/lp/scripts/garbo.py	2012-10-22 02:30:44 +0000
+++ lib/lp/scripts/garbo.py	2012-11-05 11:37:22 +0000
@@ -8,6 +8,8 @@
     'DailyDatabaseGarbageCollector',
     'FrequentDatabaseGarbageCollector',
     'HourlyDatabaseGarbageCollector',
+    'load_garbo_job_state',
+    'save_garbo_job_state',
     ]
 
 from datetime import (
@@ -17,6 +19,7 @@
 import logging
 import multiprocessing
 import os
+import simplejson
 import threading
 import time
 
@@ -28,11 +31,17 @@
 from psycopg2 import IntegrityError
 import pytz
 from storm.expr import (
+    Alias,
+    And,
+    Desc,
     In,
+    Insert,
+    Join,
     Like,
     Select,
     Update,
     )
+from storm.info import ClassAlias
 from storm.locals import (
     Max,
     Min,
@@ -105,6 +114,10 @@
     )
 from lp.services.session.model import SessionData
 from lp.services.verification.model.logintoken import LoginToken
+from lp.soyuz.model.archive import Archive
+from lp.soyuz.model.publishing import SourcePackagePublishingHistory
+from lp.soyuz.model.reporting import LatestPersonSourcepackageReleaseCache
+from lp.soyuz.model.sourcepackagerelease import SourcePackageRelease
 from lp.translations.interfaces.potemplate import IPOTemplateSet
 from lp.translations.model.potmsgset import POTMsgSet
 from lp.translations.model.potranslation import POTranslation
@@ -120,6 +133,33 @@
 ONE_DAY_IN_SECONDS = 24 * 60 * 60
 
 
+# Garbo jobs may choose to persist state between invocations, if it is likely
+# that not all data can be processed in a single run. These utility methods
+# provide convenient access to that state data.
+def load_garbo_job_state(job_name):
+    """Load the JSON state data for the given job name."""
+    store = getUtility(IStoreSelector).get(MAIN_STORE, MASTER_FLAVOR)
+    job_data = store.execute(
+        "SELECT json_data FROM GarboJobState WHERE name = ?",
+        params=(unicode(job_name),)).get_one()
+    if job_data:
+        return simplejson.loads(job_data[0])
+    return None
+
+
+def save_garbo_job_state(job_name, job_data):
+    """Save the JSON state data for the given job name."""
+    store = getUtility(IStoreSelector).get(MAIN_STORE, MASTER_FLAVOR)
+    json_data = simplejson.dumps(job_data, ensure_ascii=False)
+    result = store.execute(
+        "UPDATE GarboJobState SET json_data = ? WHERE name = ?",
+        params=(json_data, unicode(job_name)))
+    if result.rowcount == 0:
+        store.execute(
+            "INSERT INTO GarboJobState(name, json_data) "
+            "VALUES (?, ?)", params=(unicode(job_name), unicode(json_data)))
+
+
 class BulkPruner(TunableLoop):
     """A abstract ITunableLoop base class for simple pruners.
 
@@ -423,6 +463,159 @@
         transaction.commit()
 
 
+class PopulateLatestPersonSourcepackageReleaseCache(TunableLoop):
+    """Populate the LatestPersonSourcepackageReleaseCache table.
+
+    The LatestPersonSourcepackageReleaseCache table contains two sets of
+    data: one for package maintainers and another for package creators. This
+    job populates the creator data first and then the maintainer data.
+    """
+    maximum_chunk_size = 1000
+
+    def __init__(self, log, abort_time=None):
+        super_cl = super(PopulateLatestPersonSourcepackageReleaseCache, self)
+        super_cl.__init__(log, abort_time)
+        self.store = getUtility(IStoreSelector).get(MAIN_STORE, MASTER_FLAVOR)
+        # Keep a record of the processed source package release id and data
+        # type (creator or maintainer) so we know where the job got up to.
+        self.next_id = 0
+        self.current_person_filter_type = 'creator'
+        self.starting_person_filter_type = self.current_person_filter_type
+        self.job_name = self.__class__.__name__
+        job_data = load_garbo_job_state(self.job_name)
+        if job_data:
+            self.next_id = job_data['next_id']
+            self.current_person_filter_type = job_data['person_filter_type']
+            self.starting_person_filter_type = self.current_person_filter_type
+
+    def getPendingUpdates(self):
+        # Load the latest published source package release data keyed on either
+        # creator or maintainer as required.
+        if self.current_person_filter_type == 'creator':
+            person_filter = SourcePackageRelease.creatorID
+        else:
+            person_filter = SourcePackageRelease.maintainerID
+        spph = ClassAlias(SourcePackagePublishingHistory, "spph")
+        origin = [
+            SourcePackageRelease,
+            Join(
+                spph,
+                And(spph.sourcepackagereleaseID == SourcePackageRelease.id,
+                    spph.archiveID == SourcePackageRelease.upload_archiveID))]
+        spr_select = self.store.using(*origin).find(
+            (SourcePackageRelease.id, Alias(spph.id, 'spph_id')),
+            SourcePackageRelease.id > self.next_id
+        ).order_by(
+            person_filter,
+            SourcePackageRelease.upload_distroseriesID,
+            SourcePackageRelease.sourcepackagenameID,
+            SourcePackageRelease.upload_archiveID,
+            Desc(SourcePackageRelease.dateuploaded),
+            SourcePackageRelease.id
+        ).config(distinct=(
+            person_filter,
+            SourcePackageRelease.upload_distroseriesID,
+            SourcePackageRelease.sourcepackagenameID,
+            SourcePackageRelease.upload_archiveID))._get_select()
+
+        spr = Alias(spr_select, 'spr')
+        origin = [
+            SourcePackageRelease,
+            Join(spr, SQL('spr.id') == SourcePackageRelease.id),
+            Join(Archive, Archive.id == SourcePackageRelease.upload_archiveID)]
+        rs = self.store.using(*origin).find(
+            (SourcePackageRelease.id,
+            person_filter,
+            SourcePackageRelease.upload_archiveID,
+            Archive.purpose,
+            SourcePackageRelease.upload_distroseriesID,
+            SourcePackageRelease.sourcepackagenameID,
+            SourcePackageRelease.dateuploaded, SQL('spph_id'))
+        ).order_by(SourcePackageRelease.id)
+        return rs
+
+    def isDone(self):
+        # If there is no more data to process for creators, switch over to
+        # processing data for maintainers, or vice versa.
+        current_count = self.getPendingUpdates().count()
+        if current_count == 0:
+            if (self.current_person_filter_type !=
+                self.starting_person_filter_type):
+                return True
+            if self.current_person_filter_type == 'creator':
+                self.current_person_filter_type = 'maintainer'
+            else:
+                self.current_person_filter_type = 'creator'
+            self.next_id = 0
+            current_count = self.getPendingUpdates().count()
+        return current_count == 0
+
+    def update_cache(self, updates):
+        # Update the LatestPersonSourcepackageReleaseCache table. Records for
+        # each creator/maintainer will either be new inserts or updates. We try
+        # to update first, and gather data for missing (new) records along the
+        # way. At the end, a bulk insert is done for any new data.
+        # Updates is a list of data records (tuples of values).
+        # Each record is keyed on:
+        # - (creator/maintainer), archive, distroseries, sourcepackagename
+        inserts = []
+        columns = (
+            LatestPersonSourcepackageReleaseCache.sourcepackagerelease_id,
+            LatestPersonSourcepackageReleaseCache.creator_id,
+            LatestPersonSourcepackageReleaseCache.maintainer_id,
+            LatestPersonSourcepackageReleaseCache.upload_archive_id,
+            LatestPersonSourcepackageReleaseCache.archive_purpose,
+            LatestPersonSourcepackageReleaseCache.upload_distroseries_id,
+            LatestPersonSourcepackageReleaseCache.sourcepackagename_id,
+            LatestPersonSourcepackageReleaseCache.dateuploaded,
+            LatestPersonSourcepackageReleaseCache.publication_id,
+        )
+        for update in updates:
+            (spr_id, person_id, archive_id, purpose,
+             distroseries_id, spn_id, dateuploaded, spph_id) = update
+            if self.current_person_filter_type == 'creator':
+                creator_id = person_id
+                maintainer_id = None
+            else:
+                creator_id = None
+                maintainer_id = person_id
+            values = (
+                spr_id, creator_id, maintainer_id, archive_id, purpose.value,
+                distroseries_id, spn_id, dateuploaded, spph_id)
+            data = dict(zip(columns, values))
+            result = self.store.execute(Update(
+                data, And(
+                LatestPersonSourcepackageReleaseCache.upload_archive_id ==
+                    archive_id,
+                LatestPersonSourcepackageReleaseCache.upload_distroseries_id ==
+                    distroseries_id,
+                LatestPersonSourcepackageReleaseCache.sourcepackagename_id ==
+                    spn_id,
+                LatestPersonSourcepackageReleaseCache.creator_id ==
+                    creator_id,
+                LatestPersonSourcepackageReleaseCache.maintainer_id ==
+                    maintainer_id)))
+            if result.rowcount == 0:
+                inserts.append(values)
+        if inserts:
+            self.store.execute(Insert(columns, values=inserts))
+
+    def __call__(self, chunk_size):
+        max_id = self.next_id
+        updates = []
+        for update in self.getPendingUpdates()[:chunk_size]:
+            updates.append(update)
+            max_id = update[0]
+        self.update_cache(updates)
+
+        self.next_id = max_id
+        self.store.flush()
+        save_garbo_job_state(self.job_name, {
+            'next_id': max_id,
+            'person_filter_type': self.current_person_filter_type})
+        transaction.commit()
+
+
 class OpenIDConsumerNoncePruner(TunableLoop):
     """An ITunableLoop to prune old OpenIDConsumerNonce records.
 
@@ -1339,6 +1532,7 @@
         OpenIDConsumerAssociationPruner,
         AntiqueSessionPruner,
         VoucherRedeemer,
+        PopulateLatestPersonSourcepackageReleaseCache,
         ]
     experimental_tunable_loops = []
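The new GarboJobState helpers above implement a simple update-then-insert upsert over a name/JSON table. A self-contained sketch of the same pattern, using `sqlite3` and `json` from the standard library rather than Launchpad's store API:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE GarboJobState (name TEXT PRIMARY KEY, json_data TEXT)")


def save_job_state(name, data):
    # Try an UPDATE first; fall back to INSERT when no row matched.
    payload = json.dumps(data)
    cur = conn.execute(
        "UPDATE GarboJobState SET json_data = ? WHERE name = ?",
        (payload, name))
    if cur.rowcount == 0:
        conn.execute(
            "INSERT INTO GarboJobState (name, json_data) VALUES (?, ?)",
            (name, payload))


def load_job_state(name):
    row = conn.execute(
        "SELECT json_data FROM GarboJobState WHERE name = ?",
        (name,)).fetchone()
    return json.loads(row[0]) if row else None


save_job_state("job", {"next_id": 10})
save_job_state("job", {"next_id": 20})  # second save updates in place
```

Note the update-then-insert approach is racy under concurrency; it is adequate here because a garbo job is the sole writer of its own state row.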
 

=== modified file 'lib/lp/scripts/tests/test_garbo.py'
--- lib/lp/scripts/tests/test_garbo.py	2012-10-22 02:30:44 +0000
+++ lib/lp/scripts/tests/test_garbo.py	2012-11-05 11:37:22 +0000
@@ -11,6 +11,7 @@
     timedelta,
     )
 import logging
+import pytz
 from StringIO import StringIO
 import time
 
@@ -69,8 +70,10 @@
     DuplicateSessionPruner,
     FrequentDatabaseGarbageCollector,
     HourlyDatabaseGarbageCollector,
+    load_garbo_job_state,
     LoginTokenPruner,
     OpenIDConsumerAssociationPruner,
+    save_garbo_job_state,
     UnusedSessionPruner,
     )
 from lp.services.config import config
@@ -109,6 +112,8 @@
 from lp.services.verification.interfaces.authtoken import LoginTokenType
 from lp.services.verification.model.logintoken import LoginToken
 from lp.services.worlddata.interfaces.language import ILanguageSet
+from lp.soyuz.enums import PackagePublishingStatus
+from lp.soyuz.model.reporting import LatestPersonSourcepackageReleaseCache
 from lp.testing import (
     FakeAdapterMixin,
     person_logged_in,
@@ -423,6 +428,15 @@
         collector.main()
         return collector
 
+    def test_persist_garbo_state(self):
+        # Test that loading and saving garbo job state works.
+        save_garbo_job_state('job', {'data': 1})
+        data = load_garbo_job_state('job')
+        self.assertEqual({'data': 1}, data)
+        save_garbo_job_state('job', {'data': 2})
+        data = load_garbo_job_state('job')
+        self.assertEqual({'data': 2}, data)
+
     def test_OAuthNoncePruner(self):
         now = datetime.now(UTC)
         timestamps = [
@@ -1112,6 +1126,103 @@
         self.assertEqual(0, store.find(Product,
             Product._information_type == None).count())
 
+    def test_PopulateLatestPersonSourcepackageReleaseCache(self):
+        switch_dbuser('testadmin')
+        # Make some sample test data - we create published source package
+        # releases for 2 different creators and maintainers.
+        creators = []
+        for _ in range(2):
+            creators.append(self.factory.makePerson())
+        maintainers = []
+        for _ in range(2):
+            maintainers.append(self.factory.makePerson())
+
+        spn = self.factory.makeSourcePackageName()
+        distroseries = self.factory.makeDistroSeries()
+        spr1 = self.factory.makeSourcePackageRelease(
+            creator=creators[0], maintainer=maintainers[0],
+            distroseries=distroseries, sourcepackagename=spn,
+            date_uploaded=datetime(2010, 12, 1, tzinfo=pytz.UTC))
+        self.factory.makeSourcePackagePublishingHistory(
+            status=PackagePublishingStatus.PUBLISHED,
+            sourcepackagerelease=spr1)
+        spr2 = self.factory.makeSourcePackageRelease(
+            creator=creators[0], maintainer=maintainers[1],
+            distroseries=distroseries, sourcepackagename=spn,
+            date_uploaded=datetime(2010, 12, 2, tzinfo=pytz.UTC))
+        self.factory.makeSourcePackagePublishingHistory(
+            status=PackagePublishingStatus.PUBLISHED,
+            sourcepackagerelease=spr2)
+        spr3 = self.factory.makeSourcePackageRelease(
+            creator=creators[1], maintainer=maintainers[0],
+            distroseries=distroseries, sourcepackagename=spn,
+            date_uploaded=datetime(2010, 12, 3, tzinfo=pytz.UTC))
+        self.factory.makeSourcePackagePublishingHistory(
+            status=PackagePublishingStatus.PUBLISHED,
+            sourcepackagerelease=spr3)
+        spr4 = self.factory.makeSourcePackageRelease(
+            creator=creators[1], maintainer=maintainers[1],
+            distroseries=distroseries, sourcepackagename=spn,
+            date_uploaded=datetime(2010, 12, 4, tzinfo=pytz.UTC))
+        self.factory.makeSourcePackagePublishingHistory(
+            status=PackagePublishingStatus.PUBLISHED,
+            sourcepackagerelease=spr4)
+
+        transaction.commit()
+        self.runFrequently()
+
+        store = IMasterStore(LatestPersonSourcepackageReleaseCache)
+        # Check that the garbo state table has data.
+        self.assertIsNotNone(
+            store.execute(
+                'SELECT * FROM GarboJobState WHERE name=?',
+                params=[u'PopulateLatestPersonSourcepackageReleaseCache']
+            ).get_one())
+
+        def _assert_release_by_creator(creator, spr):
+            release_records = store.find(
+                LatestPersonSourcepackageReleaseCache,
+                LatestPersonSourcepackageReleaseCache.creator == creator)
+            [record] = list(release_records)
+            self.assertEqual(spr.creator, record.creator)
+            self.assertIsNone(record.maintainer)
+            self.assertEqual(
+                spr.dateuploaded, pytz.UTC.localize(record.dateuploaded))
+
+        def _assert_release_by_maintainer(maintainer, spr):
+            release_records = store.find(
+                LatestPersonSourcepackageReleaseCache,
+                LatestPersonSourcepackageReleaseCache.maintainer == maintainer)
+            [record] = list(release_records)
+            self.assertEqual(spr.maintainer, record.maintainer)
+            self.assertIsNone(record.creator)
+            self.assertEqual(
+                spr.dateuploaded, pytz.UTC.localize(record.dateuploaded))
+
+        _assert_release_by_creator(creators[0], spr2)
+        _assert_release_by_creator(creators[1], spr4)
+        _assert_release_by_maintainer(maintainers[0], spr3)
+        _assert_release_by_maintainer(maintainers[1], spr4)
+
+        # Create a newer published source package release and ensure the
+        # release cache table is correctly updated.
+        switch_dbuser('testadmin')
+        spr5 = self.factory.makeSourcePackageRelease(
+            creator=creators[1], maintainer=maintainers[1],
+            distroseries=distroseries, sourcepackagename=spn,
+            date_uploaded=datetime(2010, 12, 5, tzinfo=pytz.UTC))
+        self.factory.makeSourcePackagePublishingHistory(
+            status=PackagePublishingStatus.PUBLISHED,
+            sourcepackagerelease=spr5)
+
+        transaction.commit()
+        self.runFrequently()
+
+        _assert_release_by_creator(creators[0], spr2)
+        _assert_release_by_creator(creators[1], spr5)
+        _assert_release_by_maintainer(maintainers[0], spr3)
+        _assert_release_by_maintainer(maintainers[1], spr5)
+
 
 class TestGarboTasks(TestCaseWithFactory):
     layer = LaunchpadZopelessLayer
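The checkpointing behaviour the test above exercises — process a chunk, record the highest id handled, resume from there on the next run — can be sketched independently of TunableLoop (the names here are illustrative, not Launchpad API):

```python
def process_in_chunks(rows, chunk_size, state):
    """Process at most chunk_size rows with id > state['next_id'].

    rows is a list of (id, payload) tuples sorted by id; state is a dict
    playing the role of the JSON blob persisted in GarboJobState.
    """
    chunk = [r for r in rows if r[0] > state["next_id"]][:chunk_size]
    for row_id, payload in chunk:
        # ... real work on payload would happen here ...
        state["next_id"] = row_id  # checkpoint the highest id handled
    return [payload for _, payload in chunk]


rows = [(1, "a"), (2, "b"), (3, "c"), (4, "d"), (5, "e")]
state = {"next_id": 0}
first_run = process_in_chunks(rows, 2, state)   # handles ids 1 and 2
second_run = process_in_chunks(rows, 2, state)  # resumes from id 3
```

Because the checkpoint is saved after each chunk, an interrupted run simply picks up where the previous one left off, which is exactly why the job survives not finishing within a single garbo invocation.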

=== modified file 'lib/lp/soyuz/configure.zcml'
--- lib/lp/soyuz/configure.zcml	2012-09-28 14:48:20 +0000
+++ lib/lp/soyuz/configure.zcml	2012-11-05 11:37:22 +0000
@@ -1002,6 +1002,12 @@
       <allow interface="lp.soyuz.adapters.overrides.IOverridePolicy" />
     </class>
 
+    <class
+        class="lp.soyuz.model.reporting.LatestPersonSourcepackageReleaseCache">
+        <allow
+            interface="lp.soyuz.interfaces.reporting.ILatestPersonSourcepackageReleaseCache"/>
+    </class>
+
     <!-- ProcessAcceptedBugsJobSource -->
     <securedutility
 	component=".model.processacceptedbugsjob.ProcessAcceptedBugsJob"

=== added file 'lib/lp/soyuz/interfaces/reporting.py'
--- lib/lp/soyuz/interfaces/reporting.py	1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/interfaces/reporting.py	2012-11-05 11:37:22 +0000
@@ -0,0 +1,30 @@
+# Copyright 2012 Canonical Ltd.  This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+__metaclass__ = type
+__all__ = [
+    'ILatestPersonSourcepackageReleaseCache',
+    ]
+
+
+from zope.interface import Attribute
+from lp.soyuz.interfaces.sourcepackagerelease import ISourcePackageRelease
+
+
+class ILatestPersonSourcepackageReleaseCache(ISourcePackageRelease):
+    """Published source package release information for a person.
+
+    The records represented by this object are the latest published source
+    package releases for a given sourcepackage, distroseries, archive, keyed
+    on the package creator and maintainer. The table contains a set of data
+    records for package creators and a second set of data for package
+    maintainers. Queries can be filtered by creator or maintainer as required.
+    """
+
+    id = Attribute(
+        "The id of the associated LatestPersonSourcepackageReleaseCache "
+        "record.")
+    sourcepackagerelease = Attribute(
+        "The SourcePackageRelease which this object represents.")
+    publication = Attribute(
+        "The publication record for the associated SourcePackageRelease.")

=== added file 'lib/lp/soyuz/model/reporting.py'
--- lib/lp/soyuz/model/reporting.py	1970-01-01 00:00:00 +0000
+++ lib/lp/soyuz/model/reporting.py	2012-11-05 11:37:22 +0000
@@ -0,0 +1,51 @@
+# Copyright 2012 Canonical Ltd.  This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+__metaclass__ = type
+__all__ = [
+    'LatestPersonSourcepackageReleaseCache',
+    ]
+
+from lazr.delegates import delegates
+from storm.base import Storm
+from storm.locals import (
+    Int,
+    Reference,
+    )
+from storm.properties import DateTime
+from zope.interface import implements
+
+from lp.services.database.enumcol import EnumCol
+from lp.soyuz.enums import ArchivePurpose
+from lp.soyuz.interfaces.reporting import (
+    ILatestPersonSourcepackageReleaseCache,
+    )
+from lp.soyuz.interfaces.sourcepackagerelease import ISourcePackageRelease
+
+
+class LatestPersonSourcepackageReleaseCache(Storm):
+    """See `ILatestPersonSourcepackageReleaseCache`."""
+    implements(ILatestPersonSourcepackageReleaseCache)
+    delegates(ISourcePackageRelease, context='sourcepackagerelease')
+
+    __storm_table__ = 'LatestPersonSourcepackageReleaseCache'
+
+    id = Int(name='id', primary=True)
+    publication_id = Int(name='publication')
+    publication = Reference(
+        publication_id, 'SourcePackagePublishingHistory.id')
+    dateuploaded = DateTime(name='date_uploaded')
+    creator_id = Int(name='creator')
+    creator = Reference(creator_id, 'Person.id')
+    maintainer_id = Int(name='maintainer')
+    maintainer = Reference(maintainer_id, 'Person.id')
+    upload_archive_id = Int(name='upload_archive')
+    upload_archive = Reference(upload_archive_id, 'Archive.id')
+    archive_purpose = EnumCol(schema=ArchivePurpose)
+    upload_distroseries_id = Int(name='upload_distroseries')
+    upload_distroseries = Reference(upload_distroseries_id, 'DistroSeries.id')
+    sourcepackagename_id = Int(name='sourcepackagename')
+    sourcepackagename = Reference(sourcepackagename_id, 'SourcePackageName.id')
+    sourcepackagerelease_id = Int(name='sourcepackagerelease')
+    sourcepackagerelease = Reference(
+        sourcepackagerelease_id, 'SourcePackageRelease.id')
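The cache table denormalises a `DISTINCT ON` result: one row per (person, distroseries, sourcepackagename, archive) combination, keeping only the newest release. A standalone sketch of those semantics using plain dicts (not the Storm model or the actual Postgres query):

```python
def latest_per_key(releases):
    # Emulates SELECT DISTINCT ON (creator, distroseries, spn, archive)
    # ... ORDER BY ..., dateuploaded DESC: keep the newest release per key.
    latest = {}
    for rel in releases:
        key = (rel["creator"], rel["distroseries"],
               rel["spn"], rel["archive"])
        best = latest.get(key)
        if best is None or rel["dateuploaded"] > best["dateuploaded"]:
            latest[key] = rel
    return latest


releases = [
    {"creator": "p1", "distroseries": "s", "spn": "pkg",
     "archive": "a", "dateuploaded": 1},
    {"creator": "p1", "distroseries": "s", "spn": "pkg",
     "archive": "a", "dateuploaded": 2},
    {"creator": "p2", "distroseries": "s", "spn": "pkg",
     "archive": "a", "dateuploaded": 1},
]
cache = latest_per_key(releases)
```

Precomputing this mapping into a table is what lets the person views replace the slow `DISTINCT ON` SQL with a simple indexed select.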

