yahoo-eng-team mailing list archive: Message #48277
[Bug 1558827] Re: port filter hook for network tenant id matching breaks counting
Reviewed: https://review.openstack.org/294321
Committed: https://git.openstack.org/cgit/openstack/neutron/commit/?id=ff4067af5ba52cc205f38d12cdf68bd454445ced
Submitter: Jenkins
Branch: master
commit ff4067af5ba52cc205f38d12cdf68bd454445ced
Author: Kevin Benton <kevin@xxxxxxxxxx>
Date: Wed Mar 16 12:28:49 2016 -0700
Outerjoin to networks for port ownership filter
Change I55328cb43207654b9bb4cfb732923982d020ab0a
added a port filter to compare tenant ID to the
network owner as well. This caused the networks
table to be added to the FROM statement since
ports wasn't joined to networks for any other
reason. This resulted in an explosion of records
returned (networks * ports). SQLAlchemy would
de-dup this for us when iterating over results;
however, it would completely break the 'count()'
operation required by get_ports_count (which
the quota engine uses).
Change-Id: I5b780121ba408fba691fff9304d4a22e5892b85f
Closes-Bug: #1558827
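The commit above replaces an implicit two-table FROM clause with an explicit outer join. A minimal sketch of the difference, using hypothetical Network/Port models (not Neutron's actual ones), renders both forms of the query as SQL: without the join, SQLAlchemy simply lists both tables in FROM with no join condition; with outerjoin(), it emits a LEFT OUTER JOIN restricted to the matching network.

```python
# Sketch with hypothetical models (not Neutron's real schema) showing the
# SQL SQLAlchemy generates with and without an explicit outer join.
from sqlalchemy import Column, String, or_
from sqlalchemy.orm import Query, declarative_base

Base = declarative_base()

class Network(Base):
    __tablename__ = "networks"
    id = Column(String, primary_key=True)
    tenant_id = Column(String)

class Port(Base):
    __tablename__ = "ports"
    id = Column(String, primary_key=True)
    tenant_id = Column(String)
    network_id = Column(String)

# The ownership filter references Network as well as Port.
owner_filter = or_(Port.tenant_id == "t1", Network.tenant_id == "t1")

# Without a join: networks lands in FROM with no join condition
# (a cross join), which is what caused the record explosion.
implicit = Query(Port).filter(owner_filter)

# With an explicit outer join restricted to the port's network.
joined = Query(Port).outerjoin(
    Network, Port.network_id == Network.id).filter(owner_filter)

print(str(implicit))  # FROM lists ports and networks, no JOIN clause
print(str(joined))    # FROM ports LEFT OUTER JOIN networks ON ...
```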
** Changed in: neutron
Status: In Progress => Fix Released
--
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to neutron.
https://bugs.launchpad.net/bugs/1558827
Title:
port filter hook for network tenant id matching breaks counting
Status in neutron:
Fix Released
Bug description:
The filter hook added in https://review.openstack.org/#/c/255285
causes SQLAlchemy to add the networks table to the FROM statement
without a restricted join condition. This results in many duplicate
rows coming back from the DB query. This is okay for normal record
retrieval because SQLAlchemy would deduplicate the records. However,
when calling .count() on the query, it returns a number far too large.
This breaks the quota engine for plugins that don't use the newer
method of tracking resources.
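The count() breakage described above can be reproduced with a small self-contained sketch (hypothetical Network/Port models and an in-memory SQLite database, not Neutron's actual code): two ports and two networks yield a four-row cross product when the filter references the networks table without a join condition, so count() reports 4 instead of 2.

```python
# Sketch (hypothetical models, not Neutron's) of how an unrestricted join
# in a filter inflates Query.count().
from sqlalchemy import Column, String, create_engine, or_
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Network(Base):
    __tablename__ = "networks"
    id = Column(String, primary_key=True)
    tenant_id = Column(String)

class Port(Base):
    __tablename__ = "ports"
    id = Column(String, primary_key=True)
    tenant_id = Column(String)
    network_id = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add_all([Network(id="n1", tenant_id="t1"),
                 Network(id="n2", tenant_id="t2"),
                 Port(id="p1", tenant_id="t1", network_id="n1"),
                 Port(id="p2", tenant_id="t1", network_id="n1")])
session.commit()

# Referencing Network in the filter without joining puts both tables in
# FROM: 2 ports x 2 networks = 4 rows, all passing the filter.
broken = session.query(Port).filter(
    or_(Port.tenant_id == "t1", Network.tenant_id == "t1"))
print(broken.count())  # 4, not 2

# Restricting the join to the port's own network fixes the count.
fixed = session.query(Port).outerjoin(
    Network, Port.network_id == Network.id).filter(
    or_(Port.tenant_id == "t1", Network.tenant_id == "t1"))
print(fixed.count())  # 2
```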
To manage notifications about this bug go to:
https://bugs.launchpad.net/neutron/+bug/1558827/+subscriptions