
[Merge] lp:~cf-charmers/charm-helpers/cloud-foundry into lp:charm-helpers

 

Cory Johns has proposed merging lp:~cf-charmers/charm-helpers/cloud-foundry into lp:charm-helpers.

Requested reviews:
  Charm Helper Maintainers (charm-helpers)

For more details, see:
https://code.launchpad.net/~cf-charmers/charm-helpers/cloud-foundry/+merge/222992

Changes from the Cloud Foundry charms, most notably the Services framework, which abstracts the common use case of collecting information from various sources (relations, charm config, etc.) to configure and manage services.  See the docstrings in the merge for details.
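
For a quick feel for the Services framework before reading the diff, here is a minimal sketch of how a charm's hooks could use it.  The service name, port, and file paths are illustrative assumptions only; the ServiceManager, RelationContext, and template-callback APIs are the ones added in charmhelpers/core/services.py and charmhelpers/contrib/cloudfoundry/contexts.py:

    # hooks/hooks.py -- illustrative sketch; the service name, port, and paths are made up
    from charmhelpers.core import services
    from charmhelpers.contrib.cloudfoundry.contexts import NatsRelation

    manager = services.ServiceManager([
        {
            'service': 'exampled',              # hypothetical upstart job
            'ports': [8080],                    # opened on start, closed on stop
            'required_data': [NatsRelation()],  # only act once the nats relation is complete
            'data_ready': [
                # render templates/exampled.conf with the collected relation data
                services.template(source='exampled.conf',
                                  target='/etc/init/exampled.conf'),
            ],
        },
    ])
    manager.manage()

manage() dispatches on the name of the hook it was invoked as: on the stop hook it stops the registered services, otherwise it reconfigures them, starting or stopping each one depending on whether its required_data is ready.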
-- 
https://code.launchpad.net/~cf-charmers/charm-helpers/cloud-foundry/+merge/222992
Your team Cloud Foundry Charmers is subscribed to branch lp:~cf-charmers/charm-helpers/cloud-foundry.
=== modified file '.bzrignore'
--- .bzrignore	2013-05-22 08:12:53 +0000
+++ .bzrignore	2014-06-12 20:35:48 +0000
@@ -4,5 +4,10 @@
 MANIFEST
 charmhelpers/version.py
 .coverage
-.env/
 coverage.xml
+.venv
+cover/
+.ropeproject/
+charmhelpers.egg-info/
+.DS_Store
+.tox/
\ No newline at end of file

=== modified file 'Makefile'
--- Makefile	2013-07-09 09:47:31 +0000
+++ Makefile	2014-06-12 20:35:48 +0000
@@ -9,6 +9,7 @@
 	@echo "make deb - Create debian package"
 	@echo "make clean"
 	@echo "make userinstall - Install locally"
+	@echo "make test - Run unit tests"
 
 sdeb: source
 	scripts/build source
@@ -25,22 +26,31 @@
 	rm -rf build/ MANIFEST
 	find . -name '*.pyc' -delete
 	rm -rf dist/*
+	rm -rf .tox
 	dh_clean
 
 userinstall:
 	scripts/update-revno
 	python setup.py install --user
 
-test:
-	@echo Starting tests...
-	@$(PYTHON) /usr/bin/nosetests --nologcapture tests/
+test: 
+	@echo Starting tests...
+	@bash -c 'if [ ! -e /usr/bin/tox ]; then sudo apt-get install python-tox;fi'
+	@tox 
+
+
+coverage: 
+	@echo Starting tests...
+	@.tox/py27/bin/nosetests --nologcapture --with-coverage
+
 
 ftest:
 	@echo Starting fast tests...
-	@$(PYTHON) /usr/bin/nosetests --attr '!slow' --nologcapture tests/
+	@./bin/nosetests --attr '!slow' --nologcapture 
 
 lint:
 	@echo Checking for Python syntax...
 	@flake8 --ignore=E123,E501 $(PROJECT) $(TESTS) && echo OK
 
+
 build: test lint

=== removed file 'README.test'
--- README.test	2014-05-10 19:40:21 +0000
+++ README.test	1970-01-01 00:00:00 +0000
@@ -1,11 +0,0 @@
-Required Packages for Running Tests
------------------------------------
-python-shelltoolbox
-python-tempita
-python-nose
-python-mock
-python-testtools
-python-jinja2
-python-coverage
-python-netifaces
-python-netaddr

=== modified file 'README.txt'
--- README.txt	2013-05-11 20:24:29 +0000
+++ README.txt	2014-06-12 20:35:48 +0000
@@ -6,3 +6,23 @@
 charms that work together. In addition to basic tasks like interact-
 ing with the charm environment and the machine it runs on, it also
 helps keep you build hooks and establish relations effortlessly.
+
+Dev setup
+---
+
+Install deps, create a virtualenv, install charmhelpers:
+
+sudo apt-get install python-dev python-virtualenv libapt-pkg-dev zip
+virtualenv chdev && . chdev/bin/activate
+pip install -e bzr+lp:~cf-charmers/charm-helpers/cloud-foundry#egg=charmhelpers
+
+Run tests
+---
+
+cd chdev/src/charmhelpers
+make test
+
+
+
+
+

=== added directory 'charmhelpers/chsync'
=== added file 'charmhelpers/chsync/README'
--- charmhelpers/chsync/README	1970-01-01 00:00:00 +0000
+++ charmhelpers/chsync/README	2014-06-12 20:35:48 +0000
@@ -0,0 +1,114 @@
+Script for synchronizing charm-helpers into a charm branch.
+
+This script is intended to be used by charm authors during the development
+of their charm.  It allows authors to pull in bits of a charm-helpers source
+tree and embed it directly into their charm, to be deployed with the rest of
+their hooks and charm payload.  This script is not intended to be called
+by the hooks themselves, but instead by the charm author while they are
+hacking on a charm offline.  Consider it a method of compiling a specific
+revision of a charm-helpers branch into a given charm source tree.
+
+Some goals and benefits to using a sync tool to manage this process:
+
+    - Reduces the burden of manually copying in upstream charm helpers code
+      into a charm and helps ensure we can easily keep a specific charm's
+      helper code up to date.
+
+    - Allows authors to hack on their own working branch of charm-helpers,
+      and easily sync it into their WIP charm.  Any changes they've made to charm
+      helpers can be upstreamed via a merge of their charm-helpers branch
+      into lp:charm-helpers, ideally at the same time they are upstreaming
+      the charm itself into the charm store.  Separating charm helper
+      development from charm development can help reduce cases where charms
+      are shipping locally modified helpers.
+
+    - Avoids the need to ship the *entire* lp:charm-helpers source tree with
+      a charm.  Authors can selectively pick and choose what subset of helpers
+      to include to satisfy the goals of their charm.
+
+Loosely based on OpenStack's oslo-incubator:
+
+    https://github.com/openstack/oslo-incubator.git
+
+Allows specifying a list of dependencies to sync in from a charm-helpers
+branch.  Ideally, each charm should describe its requirements in a yaml
+config included in the charm, eg charm-helpers.yaml (NOTE: Example module
+layout as of 05/30/2013):
+
+    $ cd my-charm
+    $ cat >charm-helpers.yaml <<END
+    destination: hooks/helpers
+    branch: lp:charm-helpers
+    include:
+        - core
+        - contrib.openstack
+        - contrib.hahelpers:
+            - ceph_utils
+    END
+
+Includes may be defined as entire module sub-directories, or as individual
+.py files within a module sub-directory.
+
+The charm author can then sync in and update helpers as needed.  The following
+imports all of charmhelpers.core + charmhelpers.contrib.openstack, and only
+ceph_utils.py from charmhelpers.contrib.hahelpers:
+
+    $ charm-helper-sync -c charm-helpers.yaml
+    $ find hooks/helpers/
+    hooks/helpers/
+    hooks/helpers/contrib
+    hooks/helpers/contrib/openstack
+    hooks/helpers/contrib/openstack/openstack_utils.py
+    hooks/helpers/contrib/openstack/__init__.py
+    hooks/helpers/contrib/hahelpers
+    hooks/helpers/contrib/hahelpers/ceph_utils.py
+    hooks/helpers/contrib/hahelpers/__init__.py
+    hooks/helpers/contrib/__init__.py
+    hooks/helpers/core
+    hooks/helpers/core/hookenv.py
+    hooks/helpers/core/host.py
+    hooks/helpers/core/__init__.py
+    hooks/helpers/__init__.py
+
+
+The script will create any missing __init__.py files to ensure each subdirectory
+is importable, assuming it is run from the charm's top-level directory.
+
+By default, only directories that look like python modules and associated
+.py source files will be synced.  If you need to include other files in
+the sync (for example, template files), you can add include hints to
+your config.  This can be done either on a per-module basis using standard
+UNIX filename patterns, eg:
+
+    destination: hooks/helpers
+    branch: lp:charm-helpers
+    include:
+        - core|inc=* # include all extra files from this module.
+        - contrib.openstack|inc=*.template # include only .template's
+        - contrib.hahelpers:
+            - ceph_utils|inc=*.cfg # include .cfg files
+
+Or globally for all included assets:
+
+    destination: hooks/helpers
+    branch: lp:charm-helpers
+    options: inc=*.template,inc=*.cfg # include .templates and .cfgs globally
+    include:
+        - core
+        - contrib.openstack
+        - contrib.hahelpers:
+            - ceph_utils
+
+You may also override the configured destination directory and source bzr
+branch:
+
+    $ charm-helper-sync -b ~/src/bzr/charm-helpers-dev \
+            -d hooks/helpers-test \
+            -c charm-helpers.yaml
+
+Or not use a config file at all:
+    $ charm-helper-sync -b lp:~gandelman-a/charm-helpers/fixes \
+            -d hooks/helpers core contrib.openstack contrib.hahelpers
+
+The script will create any missing __init__.py files to ensure each subdirectory
+is importable, assuming it is run from the charm's top-level directory.

=== added file 'charmhelpers/chsync/__init__.py'
=== added file 'charmhelpers/chsync/charm_helpers_sync.py'
--- charmhelpers/chsync/charm_helpers_sync.py	1970-01-01 00:00:00 +0000
+++ charmhelpers/chsync/charm_helpers_sync.py	2014-06-12 20:35:48 +0000
@@ -0,0 +1,235 @@
+#!/usr/bin/python
+#
+# Copyright 2013 Canonical Ltd.
+
+# Authors:
+#   Adam Gandelman <adamg@xxxxxxxxxx>
+#
+
+import logging
+import optparse
+import os
+import subprocess
+import shutil
+import sys
+import tempfile
+import yaml
+
+from fnmatch import fnmatch
+
+CHARM_HELPERS_BRANCH = 'lp:charm-helpers'
+
+
+def parse_config(conf_file):
+    if not os.path.isfile(conf_file):
+        logging.error('Invalid config file: %s.' % conf_file)
+        return False
+    return yaml.load(open(conf_file).read())
+
+
+def clone_helpers(work_dir, branch):
+    dest = os.path.join(work_dir, 'charm-helpers')
+    logging.info('Checking out %s to %s.' % (branch, dest))
+    cmd = ['bzr', 'branch', branch, dest]
+    subprocess.check_call(cmd)
+    return dest
+
+
+def _module_path(module):
+    return os.path.join(*module.split('.'))
+
+
+def _src_path(src, module):
+    return os.path.join(src, 'charmhelpers', _module_path(module))
+
+
+def _dest_path(dest, module):
+    return os.path.join(dest, _module_path(module))
+
+
+def _is_pyfile(path):
+    return os.path.isfile(path + '.py')
+
+
+def ensure_init(path):
+    '''
+    Ensure the directories leading up to path are importable, omitting the
+    parent directory, e.g. for path='/hooks/helpers/foo':
+        hooks/
+        hooks/helpers/__init__.py
+        hooks/helpers/foo/__init__.py
+    '''
+    for d, dirs, files in os.walk(os.path.join(*path.split('/')[:2])):
+        _i = os.path.join(d, '__init__.py')
+        if not os.path.exists(_i):
+            logging.info('Adding missing __init__.py: %s' % _i)
+            open(_i, 'wb').close()
+
+
+def sync_pyfile(src, dest):
+    src = src + '.py'
+    src_dir = os.path.dirname(src)
+    logging.info('Syncing pyfile: %s -> %s.' % (src, dest))
+    if not os.path.exists(dest):
+        os.makedirs(dest)
+    shutil.copy(src, dest)
+    if os.path.isfile(os.path.join(src_dir, '__init__.py')):
+        shutil.copy(os.path.join(src_dir, '__init__.py'),
+                    dest)
+    ensure_init(dest)
+
+
+def get_filter(opts=None):
+    opts = opts or []
+    if 'inc=*' in opts:
+        # do not filter any files, include everything
+        return None
+
+    def _filter(dir, ls):
+        incs = [opt.split('=').pop() for opt in opts if 'inc=' in opt]
+        _filter = []
+        for f in ls:
+            _f = os.path.join(dir, f)
+
+            if not os.path.isdir(_f) and not _f.endswith('.py') and incs:
+                if True not in [fnmatch(_f, inc) for inc in incs]:
+                    logging.debug('Not syncing %s, does not match include '
+                                  'filters (%s)' % (_f, incs))
+                    _filter.append(f)
+                else:
+                    logging.debug('Including file, which matches include '
+                                  'filters (%s): %s' % (incs, _f))
+            elif (os.path.isfile(_f) and not _f.endswith('.py')):
+                logging.debug('Not syncing file: %s' % f)
+                _filter.append(f)
+            elif (os.path.isdir(_f) and not
+                  os.path.isfile(os.path.join(_f, '__init__.py'))):
+                logging.debug('Not syncing directory: %s' % f)
+                _filter.append(f)
+        return _filter
+    return _filter
+
+
+def sync_directory(src, dest, opts=None):
+    if os.path.exists(dest):
+        logging.debug('Removing existing directory: %s' % dest)
+        shutil.rmtree(dest)
+    logging.info('Syncing directory: %s -> %s.' % (src, dest))
+
+    shutil.copytree(src, dest, ignore=get_filter(opts))
+    ensure_init(dest)
+
+
+def sync(src, dest, module, opts=None):
+    if os.path.isdir(_src_path(src, module)):
+        sync_directory(_src_path(src, module), _dest_path(dest, module), opts)
+    elif _is_pyfile(_src_path(src, module)):
+        sync_pyfile(_src_path(src, module),
+                    os.path.dirname(_dest_path(dest, module)))
+    else:
+        logging.warn('Could not sync: %s. Neither a pyfile nor a directory, '
+                     'does it even exist?' % module)
+
+
+def parse_sync_options(options):
+    if not options:
+        return []
+    return options.split(',')
+
+
+def extract_options(inc, global_options=None):
+    global_options = global_options or []
+    if global_options and isinstance(global_options, basestring):
+        global_options = [global_options]
+    if '|' not in inc:
+        return (inc, global_options)
+    inc, opts = inc.split('|')
+    return (inc, parse_sync_options(opts) + global_options)
+
+
+def sync_helpers(include, src, dest, options=None):
+    if not os.path.isdir(dest):
+        os.mkdir(dest)
+
+    global_options = parse_sync_options(options)
+
+    for inc in include:
+        if isinstance(inc, str):
+            inc, opts = extract_options(inc, global_options)
+            sync(src, dest, inc, opts)
+        elif isinstance(inc, dict):
+            # could also do nested dicts here.
+            for k, v in inc.iteritems():
+                if isinstance(v, list):
+                    for m in v:
+                        inc, opts = extract_options(m, global_options)
+                        sync(src, dest, '%s.%s' % (k, inc), opts)
+
+                        
+def main():
+    parser = optparse.OptionParser()
+    parser.add_option('-c', '--config', action='store', dest='config',
+                      default=None, help='helper config file')
+    parser.add_option('-D', '--debug', action='store_true', dest='debug',
+                      default=False, help='debug')
+    parser.add_option('-b', '--branch', action='store', dest='branch',
+                      help='charm-helpers bzr branch (overrides config)')
+    parser.add_option('-d', '--destination', action='store', dest='dest_dir',
+                      help='sync destination dir (overrides config)')
+    (opts, args) = parser.parse_args()
+
+    if opts.debug:
+        logging.basicConfig(level=logging.DEBUG)
+    else:
+        logging.basicConfig(level=logging.INFO)
+
+    if opts.config:
+        logging.info('Loading charm helper config from %s.' % opts.config)
+        config = parse_config(opts.config)
+        if not config:
+            logging.error('Could not parse config from %s.' % opts.config)
+            sys.exit(1)
+    else:
+        config = {}
+
+    if 'branch' not in config:
+        config['branch'] = CHARM_HELPERS_BRANCH
+    if opts.branch:
+        config['branch'] = opts.branch
+    if opts.dest_dir:
+        config['destination'] = opts.dest_dir
+
+    if 'destination' not in config:
+        logging.error('No destination dir. specified as option or config.')
+        sys.exit(1)
+
+    if 'include' not in config:
+        if not args:
+            logging.error('No modules to sync specified as option or config.')
+            sys.exit(1)
+        config['include'] = []
+        [config['include'].append(a) for a in args]
+
+    sync_options = None
+    if 'options' in config:
+        sync_options = config['options']
+    tmpd = tempfile.mkdtemp()
+    try:
+        checkout = clone_helpers(tmpd, config['branch'])
+        sync_helpers(config['include'], checkout, config['destination'],
+                     options=sync_options)
+    except Exception, e:
+        logging.error("Could not sync: %s" % e)
+        raise e
+    finally:
+        logging.debug('Cleaning up %s' % tmpd)
+        shutil.rmtree(tmpd)
+
+                                
+if __name__ == '__main__':
+    main()
+
+
+
+
+

=== added file 'charmhelpers/chsync/example-config.yaml'
--- charmhelpers/chsync/example-config.yaml	1970-01-01 00:00:00 +0000
+++ charmhelpers/chsync/example-config.yaml	2014-06-12 20:35:48 +0000
@@ -0,0 +1,14 @@
+# Import from remote bzr branch.
+branch: lp:charm-helpers
+# install helpers to ./hooks/charmhelpers
+destination: hooks/charmhelpers
+include:
+    # include all of charmhelpers.core
+    - core
+    # all of charmhelpers.payload
+    - payload
+    # and a subset of charmhelpers.contrib.hahelpers
+    - contrib.hahelpers:
+        - openstack_common
+        - ceph_utils
+        - utils

=== added directory 'charmhelpers/contrib/cloudfoundry'
=== added file 'charmhelpers/contrib/cloudfoundry/__init__.py'
=== added file 'charmhelpers/contrib/cloudfoundry/common.py'
--- charmhelpers/contrib/cloudfoundry/common.py	1970-01-01 00:00:00 +0000
+++ charmhelpers/contrib/cloudfoundry/common.py	2014-06-12 20:35:48 +0000
@@ -0,0 +1,12 @@
+from charmhelpers.core import host
+
+from charmhelpers.fetch import (
+    apt_install, apt_update, add_source, filter_installed_packages
+)
+
+
+def prepare_cloudfoundry_environment(config_data, packages):
+    add_source(config_data['source'], config_data.get('key'))
+    apt_update(fatal=True)
+    apt_install(packages=filter_installed_packages(packages), fatal=True)
+    host.adduser('vcap')

=== added file 'charmhelpers/contrib/cloudfoundry/contexts.py'
--- charmhelpers/contrib/cloudfoundry/contexts.py	1970-01-01 00:00:00 +0000
+++ charmhelpers/contrib/cloudfoundry/contexts.py	2014-06-12 20:35:48 +0000
@@ -0,0 +1,75 @@
+import os
+import yaml
+
+from charmhelpers.core.services import RelationContext
+
+
+class StoredContext(dict):
+    """
+    A data context that always returns the data that it was first created with.
+    """
+    def __init__(self, file_name, config_data):
+        """
+        If the file exists, populate `self` with the data from the file.
+        Otherwise, populate with the given data and persist it to the file.
+        """
+        if os.path.exists(file_name):
+            self.update(self.read_context(file_name))
+        else:
+            self.store_context(file_name, config_data)
+            self.update(config_data)
+
+    def store_context(self, file_name, config_data):
+        with open(file_name, 'w') as file_stream:
+            yaml.dump(config_data, file_stream)
+
+    def read_context(self, file_name):
+        with open(file_name, 'r') as file_stream:
+            data = yaml.load(file_stream)
+            if not data:
+                raise OSError("%s is empty" % file_name)
+            return data
+
+
+class NatsRelation(RelationContext):
+    interface = 'nats'
+    required_keys = ['address', 'port', 'user', 'password']
+
+
+class MysqlRelation(RelationContext):
+    interface = 'db'
+    required_keys = ['user', 'password', 'host', 'database']
+    dsn_template = "mysql2://{user}:{password}@{host}:{port}/{database}"
+
+    def get_data(self):
+        RelationContext.get_data(self)
+        if self.is_ready():
+            for unit in self['db']:
+                if 'port' not in unit:
+                    unit['port'] = '3306'
+                unit['dsn'] = self.dsn_template.format(**unit)
+
+
+class RouterRelation(RelationContext):
+    interface = 'router'
+    required_keys = ['domain']
+
+
+class LogRouterRelation(RelationContext):
+    interface = 'logrouter'
+    required_keys = ['shared_secret', 'address', 'incoming_port', 'outgoing_port']
+
+
+class LoggregatorRelation(RelationContext):
+    interface = 'loggregator'
+    required_keys = ['address', 'incoming_port', 'outgoing_port']
+
+
+class EtcdRelation(RelationContext):
+    interface = 'etcd'
+    required_keys = ['hostname', 'port']
+
+
+class CloudControllerRelation(RelationContext):
+    interface = 'cc'
+    required_keys = ['hostname', 'port', 'user', 'password']

=== modified file 'charmhelpers/contrib/jujugui/utils.py'
--- charmhelpers/contrib/jujugui/utils.py	2013-07-16 19:58:14 +0000
+++ charmhelpers/contrib/jujugui/utils.py	2014-06-12 20:35:48 +0000
@@ -257,30 +257,31 @@
         stream.write(template.substitute(context))
 
 
-results_log = None
+gui_logger = None
 
 
 def _setupLogging():
-    global results_log
-    if results_log is not None:
+    global gui_logger
+    if gui_logger is not None:
         return
     cfg = config()
     logging.basicConfig(
         filename=cfg['command-log-file'],
+        filemode='a',
         level=logging.INFO,
         format="%(asctime)s: %(name)s@%(levelname)s %(message)s")
-    results_log = logging.getLogger('juju-gui')
+    gui_logger = logging.getLogger('juju-gui')
 
 
 def cmd_log(results):
-    global results_log
+    global gui_logger
     if not results:
         return
-    if results_log is None:
+    if gui_logger is None:
         _setupLogging()
     # Since 'results' may be multi-line output, start it on a separate line
     # from the logger timestamp, etc.
-    results_log.info('\n' + results)
+    gui_logger.info('\n' + results)
 
 
 def start_improv(staging_env, ssl_cert_path,
@@ -293,7 +294,8 @@
         'port': API_PORT,
         'staging_env': staging_env,
     }
-    render_to_file('config/juju-api-improv.conf.template', context, config_path)
+    render_to_file('config/juju-api-improv.conf.template',
+                   context, config_path)
     log('Starting the staging backend.')
     with su('root'):
         service_start(IMPROV)

=== modified file 'charmhelpers/contrib/openstack/templating.py'
--- charmhelpers/contrib/openstack/templating.py	2014-02-24 19:28:25 +0000
+++ charmhelpers/contrib/openstack/templating.py	2014-06-12 20:35:48 +0000
@@ -96,7 +96,7 @@
 
     def complete_contexts(self):
         '''
-        Return a list of interfaces that have atisfied contexts.
+        Return a list of interfaces that have satisfied contexts.
         '''
         if self._complete_contexts:
             return self._complete_contexts

=== modified file 'charmhelpers/core/host.py'
--- charmhelpers/core/host.py	2014-05-23 18:00:31 +0000
+++ charmhelpers/core/host.py	2014-06-12 20:35:48 +0000
@@ -12,7 +12,9 @@
 import string
 import subprocess
 import hashlib
+import shutil
 import apt_pkg
+from contextlib import contextmanager
 
 from collections import OrderedDict
 
@@ -63,6 +65,11 @@
             return False
 
 
+def service_available(service_name):
+    """Determine whether a system service is available"""
+    return service('status', service_name)
+
+
 def adduser(username, password=None, shell='/bin/bash', system_user=False):
     """Add a user to the system"""
     try:
@@ -146,6 +153,7 @@
         target.write(content)
 
 
+<<<<<<< TREE
 def fstab_remove(mp):
     """Remove the given mountpoint entry from /etc/fstab
     """
@@ -159,6 +167,19 @@
 
 
 def mount(device, mountpoint, options=None, persist=False, filesystem="ext3"):
+=======
+def copy_file(src, dst, owner='root', group='root', perms=0444):
+    """Create or overwrite a file with the contents of another file"""
+    log("Writing file {} {}:{} {:o} from {}".format(dst, owner, group, perms, src))
+    uid = pwd.getpwnam(owner).pw_uid
+    gid = grp.getgrnam(group).gr_gid
+    shutil.copyfile(src, dst)
+    os.chown(dst, uid, gid)
+    os.chmod(dst, perms)
+
+
+def mount(device, mountpoint, options=None, persist=False):
+>>>>>>> MERGE-SOURCE
     """Mount a filesystem at a particular mountpoint"""
     cmd_args = ['mount']
     if options is not None:
@@ -323,3 +344,24 @@
         pkgcache = apt_pkg.Cache()
     pkg = pkgcache[package]
     return apt_pkg.version_compare(pkg.current_ver.ver_str, revno)
+
+
+@contextmanager
+def chdir(d):
+    cur = os.getcwd()
+    try:
+        yield os.chdir(d)
+    finally:
+        os.chdir(cur)
+
+
+def chownr(path, owner, group):
+    uid = pwd.getpwnam(owner).pw_uid
+    gid = grp.getgrnam(group).gr_gid
+
+    for root, dirs, files in os.walk(path):
+        for name in dirs + files:
+            full = os.path.join(root, name)
+            broken_symlink = os.path.lexists(full) and not os.path.exists(full)
+            if not broken_symlink:
+                os.chown(full, uid, gid)

=== added file 'charmhelpers/core/services.py'
--- charmhelpers/core/services.py	1970-01-01 00:00:00 +0000
+++ charmhelpers/core/services.py	2014-06-12 20:35:48 +0000
@@ -0,0 +1,358 @@
+import os
+import sys
+from collections import Iterable
+from charmhelpers.core import templating
+from charmhelpers.core import host
+from charmhelpers.core import hookenv
+
+
+class ServiceManager(object):
+    def __init__(self, services=None):
+        """
+        Register a list of services, given their definitions.
+
+        Traditional charm authoring is focused on implementing hooks.  That is,
+        the charm author is thinking in terms of "What hook am I handling; what
+        does this hook need to do?"  However, in most cases, the real question
+        should be "Do I have the information I need to configure and start this
+        piece of software and, if so, what are the steps for doing so."  The
+        ServiceManager framework tries to bring the focus to the data and the
+        setup tasks, in the most declarative way possible.
+
+        Service definitions are dicts in the following formats (all keys except
+        'service' are optional):
+
+            {
+                "service": <service name>,
+                "required_data": <list of required data contexts>,
+                "data_ready": <one or more callbacks>,
+                "data_lost": <one or more callbacks>,
+                "start": <one or more callbacks>,
+                "stop": <one or more callbacks>,
+                "ports": <list of ports to manage>,
+            }
+
+        The 'required_data' list should contain dicts of required data (or
+        dependency managers that act like dicts and know how to collect the data).
+        Only when all items in the 'required_data' list are populated are the
+        'data_ready' and 'start' callbacks executed.  See `is_ready()` for more
+        information.
+
+        The 'data_ready' value should be either a single callback, or a list of
+        callbacks, to be called when all items in 'required_data' pass `is_ready()`.
+        Each callback will be called with the service name as the only parameter.
+        After all of the 'data_ready' callbacks are called, the 'start'
+        callbacks are fired.
+
+        The 'data_lost' value should be either a single callback, or a list of
+        callbacks, to be called when a 'required_data' item no longer passes
+        `is_ready()`.  Each callback will be called with the service name as the
+        only parameter.  After all of the 'data_lost' callbacks are called,
+        the 'stop' callbacks are fired.
+
+        The 'start' value should be either a single callback, or a list of
+        callbacks, to be called when starting the service, after the 'data_ready'
+        callbacks are complete.  Each callback will be called with the service
+        name as the only parameter.  This defaults to
+        `[host.service_start, services.open_ports]`.
+
+        The 'stop' value should be either a single callback, or a list of
+        callbacks, to be called when stopping the service.  If the service is
+        being stopped because it no longer has all of its 'required_data', this
+        will be called after all of the 'data_lost' callbacks are complete.
+        Each callback will be called with the service name as the only parameter.
+        This defaults to `[services.close_ports, host.service_stop]`.
+
+        The 'ports' value should be a list of ports to manage.  The default
+        'start' handler will open the ports after the service is started,
+        and the default 'stop' handler will close the ports prior to stopping
+        the service.
+
+
+        Examples:
+
+        The following registers an Upstart service called bingod that depends on
+        a mongodb relation and which runs a custom `db_migrate` function prior to
+        restarting the service, and a Runit service called spadesd.
+
+            manager = services.ServiceManager([
+                {
+                    'service': 'bingod',
+                    'ports': [80, 443],
+                    'required_data': [MongoRelation(), config(), {'my': 'data'}],
+                    'data_ready': [
+                        services.template(source='bingod.conf'),
+                        services.template(source='bingod.ini',
+                                          target='/etc/bingod.ini',
+                                          owner='bingo', perms=0400),
+                    ],
+                },
+                {
+                    'service': 'spadesd',
+                    'data_ready': services.template(source='spadesd_run.j2',
+                                                    target='/etc/sv/spadesd/run',
+                                                    perms=0555),
+                    'start': runit_start,
+                    'stop': runit_stop,
+                },
+            ])
+            manager.manage()
+        """
+        self.services = {}
+        for service in services or []:
+            service_name = service['service']
+            self.services[service_name] = service
+
+    def manage(self):
+        """
+        Handle the current hook by doing The Right Thing with the registered services.
+        """
+        hook_name = os.path.basename(sys.argv[0])
+        if hook_name == 'stop':
+            self.stop_services()
+        else:
+            self.reconfigure_services()
+
+    def reconfigure_services(self, *service_names):
+        """
+        Update all files for one or more registered services, and,
+        if ready, optionally restart them.
+
+        If no service names are given, reconfigures all registered services.
+        """
+        for service_name in service_names or self.services.keys():
+            if self.is_ready(service_name):
+                self.fire_event('data_ready', service_name)
+                self.fire_event('start', service_name, default=[
+                    host.service_restart,
+                    open_ports])
+                self.save_ready(service_name)
+            else:
+                if self.was_ready(service_name):
+                    self.fire_event('data_lost', service_name)
+                self.fire_event('stop', service_name, default=[
+                    close_ports,
+                    host.service_stop])
+                self.save_lost(service_name)
+
+    def stop_services(self, *service_names):
+        """
+        Stop one or more registered services, by name.
+
+        If no service names are given, stops all registered services.
+        """
+        for service_name in service_names or self.services.keys():
+            self.fire_event('stop', service_name, default=[
+                close_ports,
+                host.service_stop])
+
+    def get_service(self, service_name):
+        """
+        Given the name of a registered service, return its service definition.
+        """
+        service = self.services.get(service_name)
+        if not service:
+            raise KeyError('Service not registered: %s' % service_name)
+        return service
+
+    def fire_event(self, event_name, service_name, default=None):
+        """
+        Fire a data_ready, data_lost, start, or stop event on a given service.
+        """
+        service = self.get_service(service_name)
+        callbacks = service.get(event_name, default)
+        if not callbacks:
+            return
+        if not isinstance(callbacks, Iterable):
+            callbacks = [callbacks]
+        for callback in callbacks:
+            if isinstance(callback, ManagerCallback):
+                callback(self, service_name, event_name)
+            else:
+                callback(service_name)
+
+    def is_ready(self, service_name):
+        """
+        Determine if a registered service is ready, by checking its 'required_data'.
+
+        A 'required_data' item can be any mapping type, and is considered ready
+        if `bool(item)` evaluates as True.
+        """
+        service = self.get_service(service_name)
+        reqs = service.get('required_data', [])
+        return all(bool(req) for req in reqs)
+
+    def save_ready(self, service_name):
+        """
+        Save an indicator that the given service is now data_ready.
+        """
+        ready_file = '{}/.ready.{}'.format(hookenv.charm_dir(), service_name)
+        with open(ready_file, 'a'):
+            pass
+
+    def save_lost(self, service_name):
+        """
+        Save an indicator that the given service is no longer data_ready.
+        """
+        ready_file = '{}/.ready.{}'.format(hookenv.charm_dir(), service_name)
+        if os.path.exists(ready_file):
+            os.remove(ready_file)
+
+    def was_ready(self, service_name):
+        """
+        Determine if the given service was previously data_ready.
+        """
+        ready_file = '{}/.ready.{}'.format(hookenv.charm_dir(), service_name)
+        return os.path.exists(ready_file)
+
+
+class RelationContext(dict):
+    """
+    Base class for a context generator that gets relation data from juju.
+
+    Subclasses must provide `interface`, which is the interface type of interest,
+    and `required_keys`, which is the set of keys required for the relation to
+    be considered complete.  The first relation for the interface that is complete
+    will be used to populate the data for the template.
+
+    The generated context will be namespaced under the interface type, to prevent
+    potential naming conflicts.
+    """
+    interface = None
+    required_keys = []
+
+    def __init__(self, *args, **kwargs):
+        super(RelationContext, self).__init__(*args, **kwargs)
+        self.get_data()
+
+    def __bool__(self):
+        """
+        Returns True if all of the required_keys are available.
+        """
+        return self.is_ready()
+
+    __nonzero__ = __bool__
+
+    def __repr__(self):
+        return super(RelationContext, self).__repr__()
+
+    def is_ready(self):
+        """
+        Returns True if all of the `required_keys` are available from any units.
+        """
+        ready = len(self.get(self.interface, [])) > 0
+        if not ready:
+            hookenv.log('Incomplete relation: {}'.format(self.__class__.__name__), hookenv.DEBUG)
+        return ready
+
+    def _is_ready(self, unit_data):
+        """
+        Helper method that tests a set of relation data and returns True if
+        all of the `required_keys` are present.
+        """
+        return set(unit_data.keys()).issuperset(set(self.required_keys))
+
+    def get_data(self):
+        """
+        Retrieve the relation data for each unit involved in a relation and,
+        if complete, store it in a list under `self[self.interface]`.  This
+        is automatically called when the RelationContext is instantiated.
+
+        The units are sorted lexicographically first by the relation ID, then by
+        the unit ID.  Thus, if an interface has two relations, 'db:1'
+        and 'db:2', with 'db:1' having two units, 'wordpress/0' and 'wordpress/1',
+        and 'db:2' having one unit, 'mediawiki/0', all of which have a complete
+        set of data, the relation data for the units will be stored in the
+        order: 'wordpress/0', 'wordpress/1', 'mediawiki/0'.
+
+        If you only care about a single unit on the relation, you can just
+        access it as `{{ interface[0]['key'] }}`.  However, if you can at all
+        support multiple units on a relation, you should iterate over the list,
+        like:
+
+            {% for unit in interface -%}
+                {{ unit['key'] }}{% if not loop.last %},{% endif %}
+            {%- endfor %}
+
+        Note that since all sets of relation data from all related services and
+        units are in a single list, if you need to know which service or unit a
+        set of data came from, you'll need to extend this class to preserve
+        that information.
+        """
+        if not hookenv.relation_ids(self.interface):
+            return
+
+        ns = self.setdefault(self.interface, [])
+        for rid in sorted(hookenv.relation_ids(self.interface)):
+            for unit in sorted(hookenv.related_units(rid)):
+                reldata = hookenv.relation_get(rid=rid, unit=unit)
+                if self._is_ready(reldata):
+                    ns.append(reldata)
+
+
+class ManagerCallback(object):
+    """
+    Special case of a callback that takes the `ServiceManager` instance
+    in addition to the service name.
+
+    Subclasses should implement `__call__`, which should accept three parameters:
+
+        * `manager`       The `ServiceManager` instance
+        * `service_name`  The name of the service it's being triggered for
+        * `event_name`    The name of the event that this callback is handling
+    """
+    def __call__(self, manager, service_name, event_name):
+        raise NotImplementedError()
+
+
+class TemplateCallback(ManagerCallback):
+    """
+    Callback class that will render a template, for use as a ready action.
+
+    The template is rendered with the service's `required_data` contexts and written to the `target` path.
+    """
+    def __init__(self, source, target, owner='root', group='root', perms=0444):
+        self.source = source
+        self.target = target
+        self.owner = owner
+        self.group = group
+        self.perms = perms
+
+    def __call__(self, manager, service_name, event_name):
+        service = manager.get_service(service_name)
+        context = {}
+        for ctx in service.get('required_data', []):
+            context.update(ctx)
+        templating.render(self.source, self.target, context,
+                          self.owner, self.group, self.perms)
+
+
+class PortManagerCallback(ManagerCallback):
+    """
+    Callback class that will open or close ports, for use as either
+    a start or stop action.
+    """
+    def __call__(self, manager, service_name, event_name):
+        service = manager.get_service(service_name)
+        new_ports = service.get('ports', [])
+        port_file = os.path.join(hookenv.charm_dir(), '.{}.ports'.format(service_name))
+        if os.path.exists(port_file):
+            with open(port_file) as fp:
+                old_ports = fp.read().split(',')
+            for old_port in old_ports:
+                old_port = int(old_port)
+                if old_port not in new_ports:
+                    hookenv.close_port(old_port)
+        with open(port_file, 'w') as fp:
+            fp.write(','.join(str(port) for port in new_ports))
+        for port in new_ports:
+            if event_name == 'start':
+                hookenv.open_port(port)
+            elif event_name == 'stop':
+                hookenv.close_port(port)
+
+
+# Convenience aliases
+render_template = template = TemplateCallback
+open_ports = PortManagerCallback()
+close_ports = PortManagerCallback()

=== added file 'charmhelpers/core/templating.py'
--- charmhelpers/core/templating.py	1970-01-01 00:00:00 +0000
+++ charmhelpers/core/templating.py	2014-06-12 20:35:48 +0000
@@ -0,0 +1,51 @@
+import os
+
+from charmhelpers.core import host
+from charmhelpers.core import hookenv
+
+
+def render(source, target, context, owner='root', group='root', perms=0444, templates_dir=None):
+    """
+    Render a template.
+
+    The `source` path, if not absolute, is relative to the `templates_dir`.
+
+    The `target` path should be absolute.
+
+    The context should be a dict containing the values to be replaced in the
+    template.
+
+    The `owner`, `group`, and `perms` options will be passed to `write_file`.
+
+    If omitted, `templates_dir` defaults to the `templates` folder in the charm.
+
+    Note: Using this requires python-jinja2; if it is not installed, calling
+    this will attempt to use charmhelpers.fetch.apt_install to install it.
+    """
+    try:
+        from jinja2 import FileSystemLoader, Environment, exceptions
+    except ImportError:
+        try:
+            from charmhelpers.fetch import apt_install
+        except ImportError:
+            hookenv.log('Could not import jinja2, and could not import '
+                        'charmhelpers.fetch to install it',
+                        level=hookenv.ERROR)
+            raise
+        apt_install('python-jinja2', fatal=True)
+        from jinja2 import FileSystemLoader, Environment, exceptions
+
+    if templates_dir is None:
+        templates_dir = os.path.join(hookenv.charm_dir(), 'templates')
+    loader = Environment(loader=FileSystemLoader(templates_dir))
+    try:
+        source = source
+        template = loader.get_template(source)
+    except exceptions.TemplateNotFound as e:
+        hookenv.log('Could not load template %s from %s.' %
+                    (source, templates_dir),
+                    level=hookenv.ERROR)
+        raise e
+    content = template.render(context)
+    host.mkdir(os.path.dirname(target))
+    host.write_file(target, content, owner, group, perms)

=== modified file 'charmhelpers/fetch/bzrurl.py'
--- charmhelpers/fetch/bzrurl.py	2014-05-22 22:46:54 +0000
+++ charmhelpers/fetch/bzrurl.py	2014-06-12 20:35:48 +0000
@@ -4,13 +4,7 @@
     UnhandledSource
 )
 from charmhelpers.core.host import mkdir
-
-try:
-    from bzrlib.branch import Branch
-except ImportError:
-    from charmhelpers.fetch import apt_install
-    apt_install("python-bzrlib")
-    from bzrlib.branch import Branch
+from bzrlib.branch import Branch
 
 
 class BzrUrlFetchHandler(BaseFetchHandler):

=== added directory 'deps'
=== added file 'deps/Jinja2-2.7.2-py2-none-any.whl'
Binary files deps/Jinja2-2.7.2-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/Jinja2-2.7.2-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/Tempita-0.5.2-py2-none-any.whl'
Binary files deps/Tempita-0.5.2-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/Tempita-0.5.2-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/extras-0.0.3-py2-none-any.whl'
Binary files deps/extras-0.0.3-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/extras-0.0.3-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/httplib2-0.8-py2-none-any.whl'
Binary files deps/httplib2-0.8-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/httplib2-0.8-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/keyring-3.7-py2-none-any.whl'
Binary files deps/keyring-3.7-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/keyring-3.7-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/launchpadlib-1.10.2-py2-none-any.whl'
Binary files deps/launchpadlib-1.10.2-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/launchpadlib-1.10.2-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/lazr.authentication-0.1.2-py2-none-any.whl'
Binary files deps/lazr.authentication-0.1.2-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/lazr.authentication-0.1.2-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/lazr.restfulclient-0.13.3-py2-none-any.whl'
Binary files deps/lazr.restfulclient-0.13.3-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/lazr.restfulclient-0.13.3-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/lazr.uri-1.0.3-py2-none-any.whl'
Binary files deps/lazr.uri-1.0.3-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/lazr.uri-1.0.3-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/mock-1.0.1-py2-none-any.whl'
Binary files deps/mock-1.0.1-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/mock-1.0.1-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/netaddr-0.7.11-py2-none-any.whl'
Binary files deps/netaddr-0.7.11-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/netaddr-0.7.11-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/nose-1.3.1-py2-none-any.whl'
Binary files deps/nose-1.3.1-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/nose-1.3.1-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/oauth-1.0.1-py2-none-any.whl'
Binary files deps/oauth-1.0.1-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/oauth-1.0.1-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/python_mimeparse-0.1.4-py2-none-any.whl'
Binary files deps/python_mimeparse-0.1.4-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/python_mimeparse-0.1.4-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/setuptools-3.4.1-py2.py3-none-any.whl'
Binary files deps/setuptools-3.4.1-py2.py3-none-any.whl	1970-01-01 00:00:00 +0000 and deps/setuptools-3.4.1-py2.py3-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/shelltoolbox-0.2.1-py2-none-any.whl'
Binary files deps/shelltoolbox-0.2.1-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/shelltoolbox-0.2.1-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/shelltoolbox-0.2.1-py2.7.egg'
Binary files deps/shelltoolbox-0.2.1-py2.7.egg	1970-01-01 00:00:00 +0000 and deps/shelltoolbox-0.2.1-py2.7.egg	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/testresources-0.2.7-py2-none-any.whl'
Binary files deps/testresources-0.2.7-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/testresources-0.2.7-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/testtools-0.9.35-py2-none-any.whl'
Binary files deps/testtools-0.9.35-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/testtools-0.9.35-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/wadllib-1.3.2-py2-none-any.whl'
Binary files deps/wadllib-1.3.2-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/wadllib-1.3.2-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/wsgi_intercept-0.6.1-py2-none-any.whl'
Binary files deps/wsgi_intercept-0.6.1-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/wsgi_intercept-0.6.1-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'deps/wsgiref-0.1.2-py2-none-any.whl'
Binary files deps/wsgiref-0.1.2-py2-none-any.whl	1970-01-01 00:00:00 +0000 and deps/wsgiref-0.1.2-py2-none-any.whl	2014-06-12 20:35:48 +0000 differ
=== added file 'install-test-deps.sh'
--- install-test-deps.sh	1970-01-01 00:00:00 +0000
+++ install-test-deps.sh	2014-06-12 20:35:48 +0000
@@ -0,0 +1,8 @@
+#!/bin/bash
+pip install --pre \
+--allow-all-external \
+--allow-unverified launchpadlib \
+--allow-unverified python-apt \
+--allow-unverified bzr \
+--allow-unverified lazr.authentication \
+-r ./test-requirements-tox.txt

=== modified file 'setup.cfg'
--- setup.cfg	2013-07-16 21:30:42 +0000
+++ setup.cfg	2014-06-12 20:35:48 +0000
@@ -1,4 +1,6 @@
 [nosetests]
-with-coverage=1
 cover-erase=1
 cover-package=charmhelpers,tools
+cover-html=1
+cover-html-dir=cover
+nologcapture=1

=== modified file 'setup.py'
--- setup.py	2013-11-07 09:07:53 +0000
+++ setup.py	2014-06-12 20:35:48 +0000
@@ -1,4 +1,4 @@
-from distutils.core import setup
+from setuptools import setup
 import os
 
 
@@ -29,6 +29,11 @@
         "charmhelpers.contrib.jujugui",
         "charmhelpers.contrib.templating",
     ],
+    'entry_points':"""
+    [console_scripts]
+    ch-sync = charmhelpers.chsync.charm_helpers_sync:main
+    charm-helpers-sync = charmhelpers.chsync.charm_helpers_sync:main
+    """,
     'scripts': [
         "bin/chlp",
         "bin/contrib/charmsupport/charmsupport",
@@ -41,3 +46,4 @@
 
 if __name__ == '__main__':
     setup(**SETUP)
+

=== added file 'test-requirements-tox.txt'
--- test-requirements-tox.txt	1970-01-01 00:00:00 +0000
+++ test-requirements-tox.txt	2014-06-12 20:35:48 +0000
@@ -0,0 +1,18 @@
+--find-links=./deps/
+shelltoolbox==0.2.1
+coverage==3.7.1
+Jinja2==2.7.2
+MarkupSafe==0.19
+mock==1.0.1
+netaddr==0.7.11
+netifaces==0.8
+nose==1.3.1
+python-apt
+python-mimeparse==0.1.4
+PyYAML==3.11
+Tempita==0.5.2
+testtools==0.9.35
+wsgiref==0.1.2
+launchpadlib==1.10.2
+bzr==2.6.0
+ipdb

=== modified file 'test_requirements.txt'
--- test_requirements.txt	2014-06-02 12:18:22 +0000
+++ test_requirements.txt	2014-06-12 20:35:48 +0000
@@ -1,3 +1,4 @@
+<<<<<<< TREE
 # If you don't have the required packages on your system and prefer
 # to keep your system clean, you can do the following to run the
 # charm-helper test suite:
@@ -8,15 +9,27 @@
 # Then `make test` should find all its dependencies.
 coverage==3.6
 launchpadlib==1.10.2
+=======
+shelltoolbox==0.2.1
+coverage==3.7.1
+Jinja2==2.7.2
+MarkupSafe==0.19
+>>>>>>> MERGE-SOURCE
 mock==1.0.1
+<<<<<<< TREE
 nose==1.3.1
 PyYAML==3.10
+=======
+netaddr==0.7.11
+netifaces==0.8
+nose==1.3.1
+>>>>>>> MERGE-SOURCE
 python-apt
-simplejson==3.3.0
-testtools
-Tempita==0.5.1
-bzr+http://bazaar.launchpad.net/~yellow/python-shelltoolbox/trunk@17#egg=shelltoolbox
-http://alastairs-place.net/projects/netifaces/netifaces-0.6.tar.gz
-netaddr==0.7.5
+python-mimeparse==0.1.4
+PyYAML==3.11
+Tempita==0.5.2
+testtools==0.9.35
+wsgiref==0.1.2
+launchpadlib==1.10.2
 bzr==2.6.0
 Jinja2==2.7.2

=== added directory 'tests/contrib/cloudfoundry'
=== added file 'tests/contrib/cloudfoundry/__init__.py'
=== added directory 'tests/contrib/cloudfoundry/files'
=== added file 'tests/contrib/cloudfoundry/files/test.conf'
--- tests/contrib/cloudfoundry/files/test.conf	1970-01-01 00:00:00 +0000
+++ tests/contrib/cloudfoundry/files/test.conf	2014-06-12 20:35:48 +0000
@@ -0,0 +1,1 @@
+Static config file example

=== added file 'tests/contrib/cloudfoundry/test_render_context.py'
--- tests/contrib/cloudfoundry/test_render_context.py	1970-01-01 00:00:00 +0000
+++ tests/contrib/cloudfoundry/test_render_context.py	2014-06-12 20:35:48 +0000
@@ -0,0 +1,89 @@
+import mock
+import unittest
+import os
+import tempfile
+from charmhelpers.contrib.cloudfoundry import contexts
+
+
+class TestNatsRelation(unittest.TestCase):
+
+    @mock.patch('charmhelpers.core.hookenv.relation_ids')
+    def test_nats_relation_empty(self, mid):
+        mid.return_value = None
+        n = contexts.NatsRelation()
+        self.assertEqual(n, {})
+
+    @mock.patch('charmhelpers.core.hookenv.related_units')
+    @mock.patch('charmhelpers.core.hookenv.relation_ids')
+    @mock.patch('charmhelpers.core.hookenv.relation_get')
+    def test_nats_relation_populated(self, mrel, mid, mrelated):
+        mid.return_value = ['nats']
+        mrel.return_value = {'port': 1234, 'address': 'host',
+                             'user': 'user', 'password': 'password'}
+        mrelated.return_value = ['router/0']
+        n = contexts.NatsRelation()
+        expected = {'nats': [{'port': 1234, 'address': 'host',
+                              'user': 'user', 'password': 'password'}]}
+        self.assertTrue(bool(n))
+        self.assertEqual(n, expected)
+        self.assertEqual(n['nats'][0]['port'], 1234)
+
+    @mock.patch('charmhelpers.core.hookenv.related_units')
+    @mock.patch('charmhelpers.core.hookenv.relation_ids')
+    @mock.patch('charmhelpers.core.hookenv.relation_get')
+    def test_nats_relation_partial(self, mrel, mid, mrelated):
+        mid.return_value = ['nats']
+        mrel.return_value = {'address': 'host'}
+        mrelated.return_value = ['router/0']
+        n = contexts.NatsRelation()
+        self.assertEqual(n, {'nats': []})
+
+
+class TestRouterRelation(unittest.TestCase):
+
+    @mock.patch('charmhelpers.core.hookenv.relation_ids')
+    def test_router_relation_empty(self, mid):
+        mid.return_value = None
+        n = contexts.RouterRelation()
+        self.assertEqual(n, {})
+
+    @mock.patch('charmhelpers.core.hookenv.related_units')
+    @mock.patch('charmhelpers.core.hookenv.relation_ids')
+    @mock.patch('charmhelpers.core.hookenv.relation_get')
+    def test_router_relation_populated(self, mrel, mid, mrelated):
+        mid.return_value = ['router']
+        mrel.return_value = {'domain': 'example.com'}
+        mrelated.return_value = ['router/0']
+        n = contexts.RouterRelation()
+        expected = {'router': [{'domain': 'example.com'}]}
+        self.assertTrue(bool(n))
+        self.assertEqual(n, expected)
+        self.assertEqual(n['router'][0]['domain'], 'example.com')
+
+
+class TestStoredContext(unittest.TestCase):
+
+    def test_context_saving(self):
+        _, file_name = tempfile.mkstemp()
+        os.unlink(file_name)
+        self.assertFalse(os.path.isfile(file_name))
+        contexts.StoredContext(file_name, {'key': 'value'})
+        self.assertTrue(os.path.isfile(file_name))
+
+    def test_restoring(self):
+        _, file_name = tempfile.mkstemp()
+        os.unlink(file_name)
+        contexts.StoredContext(file_name, {'key': 'initial_value'})
+        self.assertTrue(os.path.isfile(file_name))
+        context = contexts.StoredContext(file_name, {'key': 'random_value'})
+        self.assertIn('key', context)
+        self.assertEqual(context['key'], 'initial_value')
+
+    def test_stored_context_raise(self):
+        _, file_name = tempfile.mkstemp()
+        with self.assertRaises(OSError):
+            contexts.StoredContext(file_name, {'key': 'initial_value'})
+        os.unlink(file_name)
+
+if __name__ == '__main__':
+    unittest.main()

=== added directory 'tests/core/templates'
=== added file 'tests/core/templates/cloud_controller_ng.yml'
--- tests/core/templates/cloud_controller_ng.yml	1970-01-01 00:00:00 +0000
+++ tests/core/templates/cloud_controller_ng.yml	2014-06-12 20:35:48 +0000
@@ -0,0 +1,173 @@
+---
+# TODO cc_ip cc public ip 
+local_route: {{ domain }}
+port: {{ cc_port }}
+pid_filename: /var/vcap/sys/run/cloud_controller_ng/cloud_controller_ng.pid
+development_mode: false
+
+message_bus_servers:
+  - nats://{{ nats_user }}:{{ nats_password }}@{{ nats_address }}:{{ nats_port }}
+
+external_domain:
+  - api.{{ domain }}
+
+system_domain_organization: {{ default_organization }}
+system_domain: {{ domain }}
+app_domains: [ {{ domain }} ]
+srv_api_uri: http://api.{{ domain }}
+
+default_app_memory: 1024
+
+cc_partition: default
+
+bootstrap_admin_email: admin@{{ default_organization }}
+
+bulk_api:
+  auth_user: bulk_api
+  auth_password: "Password"
+
+nginx:
+  use_nginx: false
+  instance_socket: "/var/vcap/sys/run/cloud_controller_ng/cloud_controller.sock"
+
+index: 1
+name: cloud_controller_ng
+
+info:
+  name: vcap
+  build: "2222"
+  version: 2
+  support_address: http://support.cloudfoundry.com
+  description: Cloud Foundry sponsored by Pivotal
+  api_version: 2.0.0
+
+
+directories:
+ tmpdir: /var/vcap/data/cloud_controller_ng/tmp
+
+
+logging:
+  file: /var/vcap/sys/log/cloud_controller_ng/cloud_controller_ng.log
+
+  syslog: vcap.cloud_controller_ng
+
+  level: debug2
+  max_retries: 1
+
+
+
+
+
+db: &db
+  database: sqlite:///var/lib/cloudfoundry/cfcloudcontroller/db/cc.db
+  max_connections: 25
+  pool_timeout: 10
+  log_level: debug2
+
+
+login:
+  url: http://uaa.{{ domain }}
+
+uaa:
+  url: http://uaa.{{ domain }}
+  resource_id: cloud_controller
+  #symmetric_secret: cc-secret
+  verification_key: |
+    -----BEGIN PUBLIC KEY-----
+    MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDHFr+KICms+tuT1OXJwhCUmR2d
+    KVy7psa8xzElSyzqx7oJyfJ1JZyOzToj9T5SfTIq396agbHJWVfYphNahvZ/7uMX
+    qHxf+ZH9BL1gk9Y6kCnbM5R60gfwjyW1/dQPjOzn9N394zd2FJoFHwdq9Qs0wBug
+    spULZVNRxq7veq/fzwIDAQAB
+    -----END PUBLIC KEY-----
+
+# App staging parameters
+staging:
+  max_staging_runtime: 900
+  auth:
+    user:
+    password: "Password"
+
+maximum_health_check_timeout: 180
+
+runtimes_file: /var/lib/cloudfoundry/cfcloudcontroller/jobs/config/runtimes.yml
+stacks_file: /var/lib/cloudfoundry/cfcloudcontroller/jobs/config/stacks.yml
+
+quota_definitions:
+  free:
+    non_basic_services_allowed: false
+    total_services: 2
+    total_routes: 1000
+    memory_limit: 1024
+  paid:
+    non_basic_services_allowed: true
+    total_services: 32
+    total_routes: 1000
+    memory_limit: 204800
+  runaway:
+    non_basic_services_allowed: true
+    total_services: 500
+    total_routes: 1000
+    memory_limit: 204800
+  trial:
+    non_basic_services_allowed: false
+    total_services: 10
+    memory_limit: 2048
+    total_routes: 1000
+    trial_db_allowed: true
+
+default_quota_definition: free
+
+resource_pool:
+  minimum_size: 65536
+  maximum_size: 536870912
+  resource_directory_key: cc-resources
+
+  cdn:
+    uri:
+    key_pair_id:
+    private_key: ""
+
+  fog_connection: {"provider":"Local","local_root":"/var/vcap/nfs/store"}
+
+packages:
+  app_package_directory_key: cc-packages
+
+  cdn:
+    uri:
+    key_pair_id:
+    private_key: ""
+
+  fog_connection: {"provider":"Local","local_root":"/var/vcap/nfs/store"}
+
+droplets:
+  droplet_directory_key: cc-droplets
+
+  cdn:
+    uri:
+    key_pair_id:
+    private_key: ""
+
+  fog_connection: {"provider":"Local","local_root":"/var/vcap/nfs/store"}
+
+buildpacks:
+  buildpack_directory_key: cc-buildpacks
+
+  cdn:
+    uri:
+    key_pair_id:
+    private_key: ""
+
+  fog_connection: {"provider":"Local","local_root":"/var/vcap/nfs/store"}
+
+db_encryption_key: Password
+
+trial_db:
+  guid: "78ad16cf-3c22-4427-a982-b9d35d746914"
+
+tasks_disabled: false
+hm9000_noop: true
+flapping_crash_count_threshold: 3
+
+disable_custom_buildpacks: false
+
+broker_client_timeout_seconds: 60

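For reference, this fixture's Jinja2 placeholders ({{ domain }}, {{ cc_port }}, {{ nats_user }}, {{ nats_password }}, {{ nats_address }}, {{ nats_port }} and {{ default_organization }}) can all be satisfied by a flat context dict like the sketch below; every value is purely illustrative and would normally come from charm config and relation data.

    context = {
        'domain': 'example.com',
        'cc_port': 9022,
        'nats_user': 'nats',
        'nats_password': 'secret',
        'nats_address': '10.0.0.1',
        'nats_port': 4222,
        'default_organization': 'example-org',
    }
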
=== added file 'tests/core/templates/fake_cc.yml'
--- tests/core/templates/fake_cc.yml	1970-01-01 00:00:00 +0000
+++ tests/core/templates/fake_cc.yml	2014-06-12 20:35:48 +0000
@@ -0,0 +1,3 @@
+host: {{nats['nats_host']}}
+port: {{nats['nats_port']}}
+domain: {{router['domain']}}

=== added file 'tests/core/templates/nginx.conf'
--- tests/core/templates/nginx.conf	1970-01-01 00:00:00 +0000
+++ tests/core/templates/nginx.conf	2014-06-12 20:35:48 +0000
@@ -0,0 +1,154 @@
+# deployment cloudcontroller nginx.conf
+#user  vcap vcap;
+
+error_log /var/vcap/sys/log/nginx_ccng/nginx.error.log;
+pid       /var/vcap/sys/run/nginx_ccng/nginx.pid;
+
+events {
+  worker_connections  8192;
+  use epoll;
+}
+
+http {
+  include       mime.types;
+  default_type  text/html;
+  server_tokens off;
+  variables_hash_max_size 1024;
+
+  log_format main  '$host - [$time_local] '
+                   '"$request" $status $bytes_sent '
+                   '"$http_referer" "$http_user_agent" '
+                   '$proxy_add_x_forwarded_for response_time:$upstream_response_time';
+
+  access_log  /var/vcap/sys/log/nginx_ccng/nginx.access.log  main;
+
+  sendfile             on;  #enable use of sendfile()
+  tcp_nopush           on;
+  tcp_nodelay          on;  #disable Nagle's algorithm
+
+  keepalive_timeout  75 20; #inherited from router
+
+  client_max_body_size 256M; #already enforced upstream, but doesn't hurt.
+
+  upstream cloud_controller {
+    server unix:/var/vcap/sys/run/cloud_controller_ng/cloud_controller.sock;
+  }
+
+  server {
+    listen    {{ nginx_port }};
+    server_name  _;
+    server_name_in_redirect off;
+    proxy_send_timeout          300;
+    proxy_read_timeout          300;
+
+    # proxy and log all CC traffic
+    location / {
+      access_log /var/vcap/sys/log/nginx_ccng/nginx.access.log  main;
+      proxy_buffering             off;
+      proxy_set_header            Host $host;
+      proxy_set_header            X-Real_IP $remote_addr;
+      proxy_set_header            X-Forwarded-For $proxy_add_x_forwarded_for;
+      proxy_redirect              off;
+      proxy_connect_timeout       10;
+      proxy_pass                 http://cloud_controller;
+    }
+
+
+    # used for x-accel-redirect uri://location/foo.txt
+    # nginx will serve the file root || location || foo.txt
+    location /droplets/ {
+      internal;
+      root   /var/vcap/nfs/store;
+    }
+
+
+
+    # used for x-accel-redirect uri://location/foo.txt
+    # nginx will serve the file root || location || foo.txt
+    location /cc-packages/ {
+      internal;
+      root   /var/vcap/nfs/store;
+    }
+
+
+    # used for x-accel-redirect uri://location/foo.txt
+    # nginx will serve the file root || location || foo.txt
+    location /cc-droplets/ {
+      internal;
+      root   /var/vcap/nfs/store;
+    }
+
+
+    location ~ (/apps/.*/application|/v2/apps/.*/bits|/services/v\d+/configurations/.*/serialized/data|/v2/buildpacks/.*/bits) {
+      # Pass altered request body to this location
+      upload_pass   @cc_uploads;
+      upload_pass_args on;
+
+      # Store files to this directory
+      upload_store /var/vcap/data/cloud_controller_ng/tmp/uploads;
+
+      # No limit for output body forwarded to CC
+      upload_max_output_body_len 0;
+
+      # Allow uploaded files to be read only by user
+      #upload_store_access user:r;
+
+      # Set specified fields in request body
+      upload_set_form_field "${upload_field_name}_name" $upload_file_name;
+      upload_set_form_field "${upload_field_name}_path" $upload_tmp_path;
+
+      #forward the following fields from existing body
+      upload_pass_form_field "^resources$";
+      upload_pass_form_field "^_method$";
+
+      #on any error, delete uploaded files.
+      upload_cleanup 400-505;
+    }
+
+    location ~ /staging/(buildpack_cache|droplets)/.*/upload {
+
+      # Allow downloading of droplets and buildpacks
+      if ($request_method = GET){
+        proxy_pass http://cloud_controller;
+      }
+
+      # Pass along auth header
+      set $auth_header $upstream_http_x_auth;
+      proxy_set_header Authorization $auth_header;
+
+      # Pass altered request body to this location
+      upload_pass   @cc_uploads;
+
+      # Store files to this directory
+      upload_store /var/vcap/data/cloud_controller_ng/tmp/staged_droplet_uploads;
+
+      # Allow uploaded files to be read only by user
+      upload_store_access user:r;
+
+      # Set specified fields in request body
+      upload_set_form_field "droplet_path" $upload_tmp_path;
+
+      #on any error, delete uploaded files.
+      upload_cleanup 400-505;
+    }
+
+    # Pass altered request body to a backend
+    location @cc_uploads {
+      proxy_pass http://unix:/var/vcap/sys/run/cloud_controller_ng/cloud_controller.sock;
+    }
+
+    location ~ ^/internal_redirect/(.*){
+      # only allow internal redirects
+      internal;
+
+      set $download_url $1;
+
+      #have to manually pass along auth header
+      set $auth_header $upstream_http_x_auth;
+      proxy_set_header Authorization $auth_header;
+
+      # Download the file and send it to client
+      proxy_pass $download_url;
+    }
+  }
+}

=== added file 'tests/core/templates/test.conf'
--- tests/core/templates/test.conf	1970-01-01 00:00:00 +0000
+++ tests/core/templates/test.conf	2014-06-12 20:35:48 +0000
@@ -0,0 +1,3 @@
+something
+listen {{nginx_port}}
+something else

=== modified file 'tests/core/test_host.py'
--- tests/core/test_host.py	2014-06-02 13:31:55 +0000
+++ tests/core/test_host.py	2014-06-12 20:35:48 +0000
@@ -96,6 +96,16 @@
         service.assert_called_with('reload', service_name)
 
     @patch.object(host, 'service')
+    def test_service_available(self, service):
+        service_name = 'foo-service'
+        service.side_effect = [True]
+        self.assertTrue(host.service_available(service_name))
+        service.side_effect = [False]
+        self.assertFalse(host.service_available(service_name))
+
+        service.assert_called_with('status', service_name)
+
+    @patch.object(host, 'service')
     def test_failed_reload_restarts_a_service(self, service):
         service_name = 'foo-service'
         service.side_effect = [False, True]
@@ -436,6 +446,50 @@
             os_.fchmod.assert_called_with(fileno, perms)
             mock_file.write.assert_called_with('what is {juju}')
 
+    @patch('pwd.getpwnam')
+    @patch('grp.getgrnam')
+    @patch.object(host, 'log')
+    @patch.object(host, 'shutil')
+    @patch.object(host, 'os')
+    def test_copies_content_to_a_file(self, os_, shutil_, log, getgrnam, getpwnam):
+        # Curly brackets here demonstrate that we are *not* rendering
+        # these strings with Python's string formatting. This is a
+        # change from the original behavior per Bug #1195634.
+        uid = 123
+        gid = 234
+        owner = 'some-user-{foo}'
+        group = 'some-group-{bar}'
+        src = '/some/path/{baz}'
+        dst = '/some/other/path/{qux}'
+        perms = 0644
+
+        getpwnam.return_value.pw_uid = uid
+        getgrnam.return_value.gr_gid = gid
+
+        host.copy_file(src, dst, owner=owner, group=group, perms=perms)
+
+        getpwnam.assert_called_with('some-user-{foo}')
+        getgrnam.assert_called_with('some-group-{bar}')
+        shutil_.copyfile.assert_called_with(src, dst)
+        os_.chown.assert_called_with(dst, uid, gid)
+        os_.chmod.assert_called_with(dst, perms)
+
+    @patch.object(host, 'log')
+    @patch.object(host, 'shutil')
+    @patch.object(host, 'os')
+    def test_copies_content_with_default(self, os_, shutil_, log):
+        uid = 0
+        gid = 0
+        src = '/some/path/{baz}'
+        dst = '/some/other/path/{qux}'
+        perms = 0444
+
+        host.copy_file(src, dst)
+
+        shutil_.copyfile.assert_called_with(src, dst)
+        os_.chown.assert_called_with(dst, uid, gid)
+        os_.chmod.assert_called_with(dst, perms)
+
     @patch('subprocess.check_output')
     @patch.object(host, 'log')
     def test_mounts_a_device(self, log, check_output):
@@ -747,3 +801,57 @@
         self.assertEqual(host.cmp_pkgrevno('python', '2.3'), 1)
         self.assertEqual(host.cmp_pkgrevno('python', '2.4'), 0)
         self.assertEqual(host.cmp_pkgrevno('python', '2.5'), -1)
+
+    @patch('os.getcwd')
+    @patch('os.chdir')
+    def test_chdir(self, chdir, getcwd):
+        getcwd.side_effect = ['quw', 'qar']
+        try:
+            with host.chdir('foo'):
+                pass
+            with host.chdir('bar'):
+                raise ValueError()
+        except ValueError:
+            pass
+        else:
+            assert False, 'Expected ValueError'
+        self.assertEqual(chdir.call_args_list, [
+            call('foo'), call('quw'),
+            call('bar'), call('qar'),
+        ])
+
+    @patch('grp.getgrnam')
+    @patch('pwd.getpwnam')
+    @patch('os.chown')
+    @patch('os.walk')
+    def test_chownr(self, walk, chown, getpwnam, getgrnam):
+        getpwnam.return_value.pw_uid = 1
+        getgrnam.return_value.gr_gid = 2
+        walk.return_value = [
+            ('root1', ['dir1', 'dir2'], ['file1', 'file2']),
+            ('root2', ['dir3'], ['file3']),
+        ]
+        host.chownr('path', 'owner', 'group')
+        getpwnam.assert_called_once_with('owner')
+        getgrnam.assert_called_once_with('group')
+        self.assertEqual(chown.call_args_list, [
+            call('root1/dir1', 1, 2),
+            call('root1/dir2', 1, 2),
+            call('root1/file1', 1, 2),
+            call('root1/file2', 1, 2),
+            call('root2/dir3', 1, 2),
+            call('root2/file3', 1, 2),
+        ])
+
+    @patch('grp.getgrnam')
+    @patch('pwd.getpwnam')
+    @patch('os.path.exists')
+    @patch('os.path.lexists')
+    @patch('os.chown')
+    @patch('os.walk')
+    def test_chownr_broken_symlink(self, walk, chown, lexists, exists, getpwnam, getgrnam):
+        walk.return_value = [('root', [], ['file'])]
+        lexists.return_value = True
+        exists.return_value = False
+        host.chownr('path', 'owner', 'group')
+        self.assertFalse(chown.called)

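The host helpers exercised by these new tests (service_available, copy_file, chdir and chownr) are intended to be called from charm hooks. A minimal usage sketch follows; the service name and paths are illustrative assumptions only:

    from charmhelpers.core import host

    # Only restart when the init system actually knows about the service
    # (service_available wraps `service status <name>`).
    if host.service_available('cloud-controller-ng'):
        host.service_restart('cloud-controller-ng')

    # Copy a file into place with explicit ownership and permissions.
    host.copy_file('files/cc.yml', '/etc/cloudfoundry/cc.yml',
                   owner='vcap', group='vcap', perms=0644)

    # Recursively chown a tree; broken symlinks are skipped.
    host.chownr('/var/vcap/store', 'vcap', 'vcap')

    # Temporarily switch directories, restoring the previous cwd afterwards,
    # even if the block raises.
    with host.chdir('/var/vcap/store'):
        pass
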
=== added file 'tests/core/test_services.py'
--- tests/core/test_services.py	1970-01-01 00:00:00 +0000
+++ tests/core/test_services.py	2014-06-12 20:35:48 +0000
@@ -0,0 +1,423 @@
+import mock
+import unittest
+from charmhelpers.core import services
+
+
+class TestServiceManager(unittest.TestCase):
+    def test_register(self):
+        manager = services.ServiceManager([
+            {'service': 'service1',
+             'foo': 'bar'},
+            {'service': 'service2',
+             'qux': 'baz'},
+        ])
+        self.assertEqual(manager.services, {
+            'service1': {'service': 'service1',
+                         'foo': 'bar'},
+            'service2': {'service': 'service2',
+                         'qux': 'baz'},
+        })
+
+    @mock.patch.object(services.ServiceManager, 'reconfigure_services')
+    @mock.patch.object(services.ServiceManager, 'stop_services')
+    @mock.patch.object(services, 'sys')
+    def test_manage_stop(self, msys, stop_services, reconfigure_services):
+        manager = services.ServiceManager()
+        msys.argv = ['charm_dir/hooks/stop']
+        manager.manage()
+        stop_services.assert_called_once_with()
+        assert not reconfigure_services.called
+
+    @mock.patch.object(services.ServiceManager, 'reconfigure_services')
+    @mock.patch.object(services.ServiceManager, 'stop_services')
+    @mock.patch.object(services, 'sys')
+    def test_manage_other(self, msys, stop_services, reconfigure_services):
+        manager = services.ServiceManager()
+        msys.argv = ['charm_dir/hooks/config-changed']
+        manager.manage()
+        assert not stop_services.called
+        reconfigure_services.assert_called_once_with()
+
+    @mock.patch.object(services.ServiceManager, 'save_ready')
+    @mock.patch.object(services.ServiceManager, 'fire_event')
+    @mock.patch.object(services.ServiceManager, 'is_ready')
+    def test_reconfigure_ready(self, is_ready, fire_event, save_ready):
+        manager = services.ServiceManager([
+            {'service': 'service1'}, {'service': 'service2'}])
+        is_ready.return_value = True
+        manager.reconfigure_services()
+        is_ready.assert_has_calls([
+            mock.call('service1'),
+            mock.call('service2'),
+        ], any_order=True)
+        fire_event.assert_has_calls([
+            mock.call('data_ready', 'service1'),
+            mock.call('start', 'service1', default=[
+                services.host.service_restart,
+                services.open_ports]),
+        ], any_order=False)
+        fire_event.assert_has_calls([
+            mock.call('data_ready', 'service2'),
+            mock.call('start', 'service2', default=[
+                services.host.service_restart,
+                services.open_ports]),
+        ], any_order=False)
+        save_ready.assert_has_calls([
+            mock.call('service1'),
+            mock.call('service2'),
+        ], any_order=True)
+
+    @mock.patch.object(services.ServiceManager, 'save_ready')
+    @mock.patch.object(services.ServiceManager, 'fire_event')
+    @mock.patch.object(services.ServiceManager, 'is_ready')
+    def test_reconfigure_ready_list(self, is_ready, fire_event, save_ready):
+        manager = services.ServiceManager([
+            {'service': 'service1'}, {'service': 'service2'}])
+        is_ready.return_value = True
+        manager.reconfigure_services('service3', 'service4')
+        self.assertEqual(is_ready.call_args_list, [
+            mock.call('service3'),
+            mock.call('service4'),
+        ])
+        self.assertEqual(fire_event.call_args_list, [
+            mock.call('data_ready', 'service3'),
+            mock.call('start', 'service3', default=[
+                services.host.service_restart,
+                services.open_ports]),
+            mock.call('data_ready', 'service4'),
+            mock.call('start', 'service4', default=[
+                services.host.service_restart,
+                services.open_ports]),
+        ])
+        self.assertEqual(save_ready.call_args_list, [
+            mock.call('service3'),
+            mock.call('service4'),
+        ])
+
+    @mock.patch.object(services.ServiceManager, 'save_lost')
+    @mock.patch.object(services.ServiceManager, 'fire_event')
+    @mock.patch.object(services.ServiceManager, 'was_ready')
+    @mock.patch.object(services.ServiceManager, 'is_ready')
+    def test_reconfigure_not_ready(self, is_ready, was_ready, fire_event, save_lost):
+        manager = services.ServiceManager([
+            {'service': 'service1'}, {'service': 'service2'}])
+        is_ready.return_value = False
+        was_ready.return_value = False
+        manager.reconfigure_services()
+        is_ready.assert_has_calls([
+            mock.call('service1'),
+            mock.call('service2'),
+        ], any_order=True)
+        fire_event.assert_has_calls([
+            mock.call('stop', 'service1', default=[
+                services.close_ports,
+                services.host.service_stop]),
+            mock.call('stop', 'service2', default=[
+                services.close_ports,
+                services.host.service_stop]),
+        ], any_order=True)
+        save_lost.assert_has_calls([
+            mock.call('service1'),
+            mock.call('service2'),
+        ], any_order=True)
+
+    @mock.patch.object(services.ServiceManager, 'save_lost')
+    @mock.patch.object(services.ServiceManager, 'fire_event')
+    @mock.patch.object(services.ServiceManager, 'was_ready')
+    @mock.patch.object(services.ServiceManager, 'is_ready')
+    def test_reconfigure_no_longer_ready(self, is_ready, was_ready, fire_event, save_lost):
+        manager = services.ServiceManager([
+            {'service': 'service1'}, {'service': 'service2'}])
+        is_ready.return_value = False
+        was_ready.return_value = True
+        manager.reconfigure_services()
+        is_ready.assert_has_calls([
+            mock.call('service1'),
+            mock.call('service2'),
+        ], any_order=True)
+        fire_event.assert_has_calls([
+            mock.call('data_lost', 'service1'),
+            mock.call('stop', 'service1', default=[
+                services.close_ports,
+                services.host.service_stop]),
+        ], any_order=False)
+        fire_event.assert_has_calls([
+            mock.call('data_lost', 'service2'),
+            mock.call('stop', 'service2', default=[
+                services.close_ports,
+                services.host.service_stop]),
+        ], any_order=False)
+        save_lost.assert_has_calls([
+            mock.call('service1'),
+            mock.call('service2'),
+        ], any_order=True)
+
+    @mock.patch.object(services.ServiceManager, 'fire_event')
+    def test_stop_services(self, fire_event):
+        manager = services.ServiceManager([
+            {'service': 'service1'}, {'service': 'service2'}])
+        manager.stop_services()
+        fire_event.assert_has_calls([
+            mock.call('stop', 'service1', default=[
+                services.close_ports,
+                services.host.service_stop]),
+            mock.call('stop', 'service2', default=[
+                services.close_ports,
+                services.host.service_stop]),
+        ], any_order=True)
+
+    @mock.patch.object(services.ServiceManager, 'fire_event')
+    def test_stop_services_list(self, fire_event):
+        manager = services.ServiceManager([
+            {'service': 'service1'}, {'service': 'service2'}])
+        manager.stop_services('service3', 'service4')
+        self.assertEqual(fire_event.call_args_list, [
+            mock.call('stop', 'service3', default=[
+                services.close_ports,
+                services.host.service_stop]),
+            mock.call('stop', 'service4', default=[
+                services.close_ports,
+                services.host.service_stop]),
+        ])
+
+    def test_get_service(self):
+        service = {'service': 'test', 'test': 'test_service'}
+        manager = services.ServiceManager([service])
+        self.assertEqual(manager.get_service('test'), service)
+
+    def test_get_service_not_registered(self):
+        service = {'service': 'test', 'test': 'test_service'}
+        manager = services.ServiceManager([service])
+        self.assertRaises(KeyError, manager.get_service, 'foo')
+
+    @mock.patch.object(services.ServiceManager, 'get_service')
+    def test_fire_event_default(self, get_service):
+        get_service.return_value = {}
+        cb = mock.Mock()
+        manager = services.ServiceManager()
+        manager.fire_event('event', 'service', cb)
+        cb.assert_called_once_with('service')
+
+    @mock.patch.object(services.ServiceManager, 'get_service')
+    def test_fire_event_default_list(self, get_service):
+        get_service.return_value = {}
+        cb = mock.Mock()
+        manager = services.ServiceManager()
+        manager.fire_event('event', 'service', [cb])
+        cb.assert_called_once_with('service')
+
+    @mock.patch.object(services.ServiceManager, 'get_service')
+    def test_fire_event_simple_callback(self, get_service):
+        cb = mock.Mock()
+        dcb = mock.Mock()
+        get_service.return_value = {'event': cb}
+        manager = services.ServiceManager()
+        manager.fire_event('event', 'service', dcb)
+        assert not dcb.called
+        cb.assert_called_once_with('service')
+
+    @mock.patch.object(services.ServiceManager, 'get_service')
+    def test_fire_event_simple_callback_list(self, get_service):
+        cb = mock.Mock()
+        dcb = mock.Mock()
+        get_service.return_value = {'event': [cb]}
+        manager = services.ServiceManager()
+        manager.fire_event('event', 'service', dcb)
+        assert not dcb.called
+        cb.assert_called_once_with('service')
+
+    @mock.patch.object(services.ManagerCallback, '__call__')
+    @mock.patch.object(services.ServiceManager, 'get_service')
+    def test_fire_event_manager_callback(self, get_service, mcall):
+        cb = services.ManagerCallback()
+        dcb = mock.Mock()
+        get_service.return_value = {'event': cb}
+        manager = services.ServiceManager()
+        manager.fire_event('event', 'service', dcb)
+        assert not dcb.called
+        mcall.assert_called_once_with(manager, 'service', 'event')
+
+    @mock.patch.object(services.ManagerCallback, '__call__')
+    @mock.patch.object(services.ServiceManager, 'get_service')
+    def test_fire_event_manager_callback_list(self, get_service, mcall):
+        cb = services.ManagerCallback()
+        dcb = mock.Mock()
+        get_service.return_value = {'event': [cb]}
+        manager = services.ServiceManager()
+        manager.fire_event('event', 'service', dcb)
+        assert not dcb.called
+        mcall.assert_called_once_with(manager, 'service', 'event')
+
+    @mock.patch.object(services.ServiceManager, 'get_service')
+    def test_is_ready(self, get_service):
+        get_service.side_effect = [
+            {},
+            {'required_data': [True]},
+            {'required_data': [False]},
+            {'required_data': [True, False]},
+        ]
+        manager = services.ServiceManager()
+        assert manager.is_ready('foo')
+        assert manager.is_ready('bar')
+        assert not manager.is_ready('foo')
+        assert not manager.is_ready('foo')
+        get_service.assert_has_calls([mock.call('foo'), mock.call('bar')])
+
+    @mock.patch.object(services.hookenv, 'charm_dir')
+    @mock.patch.object(services, 'open', create=True)
+    def test_save_ready(self, mopen, charm_dir):
+        charm_dir.return_value = 'charm_dir'
+        manager = services.ServiceManager()
+        manager.save_ready('foo')
+        mopen.assert_called_once_with('charm_dir/.ready.foo', 'a')
+
+    @mock.patch.object(services.hookenv, 'charm_dir')
+    @mock.patch('os.remove')
+    @mock.patch('os.path.exists')
+    def test_save_lost(self, exists, remove, charm_dir):
+        charm_dir.return_value = 'charm_dir'
+        manager = services.ServiceManager()
+        manager.save_lost('foo')
+        exists.assert_called_once_with('charm_dir/.ready.foo')
+        remove.assert_called_once_with('charm_dir/.ready.foo')
+
+    @mock.patch.object(services.hookenv, 'charm_dir')
+    @mock.patch('os.path.exists')
+    def test_was_ready(self, exists, charm_dir):
+        charm_dir.return_value = 'charm_dir'
+        manager = services.ServiceManager()
+        manager.was_ready('foo')
+        exists.assert_called_once_with('charm_dir/.ready.foo')
+
+
+class TestRelationContext(unittest.TestCase):
+    def setUp(self):
+        with mock.patch.object(services, 'hookenv') as mhookenv:
+            mhookenv.relation_ids.return_value = []
+            self.context = services.RelationContext()
+        self.context.interface = 'http'
+        self.context.required_keys = ['foo', 'bar']
+
+    @mock.patch.object(services, 'hookenv')
+    def test_no_relations(self, mhookenv):
+        mhookenv.relation_ids.return_value = []
+        self.context.get_data()
+        self.assertFalse(self.context.is_ready())
+        self.assertEqual(self.context, {})
+        mhookenv.relation_ids.assert_called_once_with('http')
+
+    @mock.patch.object(services, 'hookenv')
+    def test_no_units(self, mhookenv):
+        mhookenv.relation_ids.return_value = ['nginx']
+        mhookenv.related_units.return_value = []
+        self.context.get_data()
+        self.assertFalse(self.context.is_ready())
+        self.assertEqual(self.context, {'http': []})
+
+    @mock.patch.object(services, 'hookenv')
+    def test_incomplete(self, mhookenv):
+        mhookenv.relation_ids.return_value = ['nginx', 'apache']
+        mhookenv.related_units.side_effect = lambda i: [i+'/0']
+        mhookenv.relation_get.side_effect = [{}, {'foo': '1'}]
+        self.context.get_data()
+        self.assertFalse(bool(self.context))
+        self.assertEqual(mhookenv.relation_get.call_args_list, [
+            mock.call(rid='apache', unit='apache/0'),
+            mock.call(rid='nginx', unit='nginx/0'),
+        ])
+
+    @mock.patch.object(services, 'hookenv')
+    def test_complete(self, mhookenv):
+        mhookenv.relation_ids.return_value = ['nginx', 'apache', 'tomcat']
+        mhookenv.related_units.side_effect = lambda i: [i+'/0']
+        mhookenv.relation_get.side_effect = [{'foo': '1'}, {'foo': '2', 'bar': '3'}, {}]
+        self.context.get_data()
+        self.assertTrue(self.context.is_ready())
+        self.assertEqual(self.context, {'http': [
+            {
+                'foo': '2',
+                'bar': '3',
+            },
+        ]})
+        mhookenv.relation_ids.assert_called_with('http')
+        self.assertEqual(mhookenv.relation_get.call_args_list, [
+            mock.call(rid='apache', unit='apache/0'),
+            mock.call(rid='nginx', unit='nginx/0'),
+            mock.call(rid='tomcat', unit='tomcat/0'),
+        ])
+
+
+class TestTemplateCallback(unittest.TestCase):
+    @mock.patch.object(services, 'templating')
+    def test_template_defaults(self, mtemplating):
+        manager = mock.Mock(**{'get_service.return_value': {
+            'required_data': [{'foo': 'bar'}]}})
+        self.assertRaises(TypeError, services.template, source='foo.yml')
+        callback = services.template(source='foo.yml', target='bar.yml')
+        assert isinstance(callback, services.ManagerCallback)
+        assert not mtemplating.render.called
+        callback(manager, 'test', 'event')
+        mtemplating.render.assert_called_once_with(
+            'foo.yml', 'bar.yml', {'foo': 'bar'},
+            'root', 'root', 0444)
+
+    @mock.patch.object(services, 'templating')
+    def test_template_explicit(self, mtemplating):
+        manager = mock.Mock(**{'get_service.return_value': {
+            'required_data': [{'foo': 'bar'}]}})
+        callback = services.template(
+            source='foo.yml', target='bar.yml',
+            owner='user', group='group', perms=0555
+        )
+        assert isinstance(callback, services.ManagerCallback)
+        assert not mtemplating.render.called
+        callback(manager, 'test', 'event')
+        mtemplating.render.assert_called_once_with(
+            'foo.yml', 'bar.yml', {'foo': 'bar'},
+            'user', 'group', 0555)
+
+
+class TestPortsCallback(unittest.TestCase):
+    @mock.patch.object(services, 'open', create=True)
+    @mock.patch.object(services, 'hookenv')
+    def test_no_ports(self, hookenv, mopen):
+        hookenv.charm_dir.return_value = 'charm_dir'
+        manager = mock.Mock(**{'get_service.return_value': {}})
+        services.PortManagerCallback()(manager, 'service', 'event')
+        assert not hookenv.open_port.called
+        assert not hookenv.close_port.called
+
+    @mock.patch.object(services, 'open', create=True)
+    @mock.patch.object(services, 'hookenv')
+    def test_open_ports(self, hookenv, mopen):
+        hookenv.charm_dir.return_value = 'charm_dir'
+        manager = mock.Mock(**{'get_service.return_value': {'ports': [1, 2]}})
+        services.open_ports(manager, 'service', 'start')
+        hookenv.open_port.assert_has_calls([mock.call(1), mock.call(2)])
+        assert not hookenv.close_port.called
+
+    @mock.patch.object(services, 'open', create=True)
+    @mock.patch.object(services, 'hookenv')
+    def test_close_ports(self, hookenv, mopen):
+        hookenv.charm_dir.return_value = 'charm_dir'
+        manager = mock.Mock(**{'get_service.return_value': {'ports': [1, 2]}})
+        services.close_ports(manager, 'service', 'stop')
+        assert not hookenv.open_port.called
+        hookenv.close_port.assert_has_calls([mock.call(1), mock.call(2)])
+
+    @mock.patch.object(services, 'open', create=True)
+    @mock.patch.object(services, 'hookenv')
+    def test_close_old_ports(self, hookenv, mopen):
+        hookenv.charm_dir.return_value = 'charm_dir'
+        mopen.return_value.read.return_value = '10,20'
+        manager = mock.Mock(**{'get_service.return_value': {'ports': [1, 2]}})
+        services.close_ports(manager, 'service', 'stop')
+        assert not hookenv.open_port.called
+        hookenv.close_port.has_calls([
+            mock.call(10),
+            mock.call(20),
+            mock.call(1),
+            mock.call(2)])
+
+if __name__ == '__main__':
+    unittest.main()

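Taken together, these tests sketch the intended hook-side usage of the Services framework: a ServiceManager is built from a list of service definitions and manage() dispatches on the hook name. A minimal, illustrative sketch follows; the relation interface, required keys, service name, port and paths are all assumptions, not part of this merge:

    from charmhelpers.core import services

    class NatsRelation(services.RelationContext):
        # Collects data from units related over the 'nats' interface and is
        # only "ready" once a related unit supplies every required key.
        interface = 'nats'
        required_keys = ['nats_host', 'nats_port']

    manager = services.ServiceManager([
        {
            'service': 'cloud-controller-ng',     # init service to start/stop
            'ports': [9022],                      # opened on start, closed on stop
            'required_data': [NatsRelation()],    # gates data_ready/start
            'data_ready': [
                # Render a config file from the collected context before starting.
                services.template(source='cc.yml',
                                  target='/etc/cloudfoundry/cc.yml',
                                  owner='vcap', group='vcap', perms=0444),
            ],
        },
    ])
    manager.manage()   # 'stop' hook -> stop_services(); any other hook ->
                       # reconfigure_services()
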
=== added file 'tests/core/test_templating.py'
--- tests/core/test_templating.py	1970-01-01 00:00:00 +0000
+++ tests/core/test_templating.py	2014-06-12 20:35:48 +0000
@@ -0,0 +1,58 @@
+import pkg_resources
+import tempfile
+import unittest
+import jinja2
+
+import mock
+from charmhelpers.core import templating
+
+
+TEMPLATES_DIR = pkg_resources.resource_filename(__name__, 'templates')
+
+
+class TestTemplating(unittest.TestCase):
+    def setUp(self):
+        self.charm_dir = pkg_resources.resource_filename(__name__, '')
+        self._charm_dir_patch = mock.patch.object(templating.hookenv, 'charm_dir')
+        self._charm_dir_mock = self._charm_dir_patch.start()
+        self._charm_dir_mock.side_effect = lambda: self.charm_dir
+
+    def tearDown(self):
+        self._charm_dir_patch.stop()
+
+    @mock.patch.object(templating.host.os, 'fchown')
+    @mock.patch.object(templating.host, 'mkdir')
+    @mock.patch.object(templating.host, 'log')
+    def test_render(self, log, mkdir, fchown):
+        _, fn1 = tempfile.mkstemp()
+        _, fn2 = tempfile.mkstemp()
+        context = {
+            'nats': {
+                'nats_port': '1234',
+                'nats_host': 'example.com',
+            },
+            'router': {
+                'domain': 'api.foo.com'
+            },
+            'nginx_port': 80,
+        }
+        templating.render('fake_cc.yml', fn1, context, templates_dir=TEMPLATES_DIR)
+        contents = open(fn1).read()
+        self.assertRegexpMatches(contents, 'port: 1234')
+        self.assertRegexpMatches(contents, 'host: example.com')
+        self.assertRegexpMatches(contents, 'domain: api.foo.com')
+
+        templating.render('test.conf', fn2, context, templates_dir=TEMPLATES_DIR)
+        contents = open(fn2).read()
+        self.assertRegexpMatches(contents, 'listen 80')
+        self.assertEqual(fchown.call_count, 2)
+        self.assertEqual(mkdir.call_count, 2)
+
+    @mock.patch.object(templating, 'hookenv')
+    @mock.patch('jinja2.Environment')
+    def test_load_error(self, Env, hookenv):
+        Env().get_template.side_effect = jinja2.exceptions.TemplateNotFound('fake_cc.yml')
+        self.assertRaises(
+            jinja2.exceptions.TemplateNotFound, templating.render,
+            'fake.src', 'fake.tgt', {}, templates_dir='tmpl')
+        hookenv.log.assert_called_once_with('Could not load template fake.src from tmpl.', level=hookenv.ERROR)

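The render helper these tests cover loads a Jinja2 template from the charm (or from an explicit templates_dir), writes the result to the target path, and applies ownership and permissions after creating the parent directory. A small sketch using the fake_cc.yml fixture added above; the target path and context values are illustrative:

    from charmhelpers.core import templating

    context = {
        'nats': {'nats_host': 'nats.example.com', 'nats_port': 4222},
        'router': {'domain': 'example.com'},
    }
    # With no owner/group/perms given, the defaults (root:root, 0444) match
    # the values exercised by the TemplateCallback test in test_services.py.
    templating.render('fake_cc.yml', '/etc/cloudfoundry/fake_cc.yml', context)
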
=== modified file 'tests/tools/test_charm_helper_sync.py'
--- tests/tools/test_charm_helper_sync.py	2013-07-12 01:28:42 +0000
+++ tests/tools/test_charm_helper_sync.py	2014-06-12 20:35:48 +0000
@@ -2,7 +2,7 @@
 from mock import call, patch
 import yaml
 
-import tools.charm_helpers_sync.charm_helpers_sync as sync
+import charmhelpers.chsync.charm_helpers_sync as sync
 
 INCLUDE = """
 include:
@@ -67,7 +67,7 @@
         for c in ex:
             self.assertIn(c, _open.call_args_list)
 
-    @patch('tools.charm_helpers_sync.charm_helpers_sync.ensure_init')
+    @patch('charmhelpers.chsync.charm_helpers_sync.ensure_init')
     @patch('os.path.isfile')
     @patch('shutil.copy')
     @patch('os.makedirs')
@@ -139,8 +139,8 @@
         '''It does not filter anything if option specified to include all'''
         self.assertEquals(sync.get_filter(opts=['inc=*']), None)
 
-    @patch('tools.charm_helpers_sync.charm_helpers_sync.get_filter')
-    @patch('tools.charm_helpers_sync.charm_helpers_sync.ensure_init')
+    @patch('charmhelpers.chsync.charm_helpers_sync.get_filter')
+    @patch('charmhelpers.chsync.charm_helpers_sync.ensure_init')
     @patch('shutil.copytree')
     @patch('shutil.rmtree')
     @patch('os.path.exists')
@@ -164,7 +164,7 @@
             '/tmp/charm-helpers/charmhelpers/core/host.py'
         )
 
-    @patch('tools.charm_helpers_sync.charm_helpers_sync.sync_directory')
+    @patch('charmhelpers.chsync.charm_helpers_sync.sync_directory')
     @patch('os.path.isdir')
     def test_syncs_directory(self, is_dir, sync_dir):
         '''It correctly syncs a module directory'''
@@ -177,8 +177,8 @@
             '/tmp/charm-helpers/charmhelpers/contrib/openstack',
             'hooks/charmhelpers/contrib/openstack', None)
 
-    @patch('tools.charm_helpers_sync.charm_helpers_sync.sync_pyfile')
-    @patch('tools.charm_helpers_sync.charm_helpers_sync._is_pyfile')
+    @patch('charmhelpers.chsync.charm_helpers_sync.sync_pyfile')
+    @patch('charmhelpers.chsync.charm_helpers_sync._is_pyfile')
     @patch('os.path.isdir')
     def test_syncs_file(self, is_dir, is_pyfile, sync_pyfile):
         '''It correctly syncs a module file'''
@@ -192,7 +192,7 @@
             '/tmp/charm-helpers/charmhelpers/contrib/openstack/utils',
             'hooks/charmhelpers/contrib/openstack')
 
-    @patch('tools.charm_helpers_sync.charm_helpers_sync.sync')
+    @patch('charmhelpers.chsync.charm_helpers_sync.sync')
     @patch('os.path.isdir')
     def test_sync_helpers_from_config(self, isdir, _sync):
         '''It correctly syncs a list of included helpers'''
@@ -247,3 +247,4 @@
         ex = ('contrib.openstack.templates',
               ['inc=*.template', 'inc=foo.*', 'inc=*.cfg'])
         self.assertEquals(ex, result)
+

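With the sync tool relocated from tools/ into the charmhelpers package, anything that previously imported it needs the new path. Assuming the moved module keeps the entry points of the removed copy shown below, the change is just the import:

    # was: import tools.charm_helpers_sync.charm_helpers_sync as sync
    from charmhelpers.chsync import charm_helpers_sync as sync

    config = sync.parse_config('charm-helpers.yaml')   # parsed YAML dict, or False
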
=== removed directory 'tools'
=== removed file 'tools/__init__.py'
=== removed directory 'tools/charm_helpers_sync'
=== removed file 'tools/charm_helpers_sync/README'
--- tools/charm_helpers_sync/README	2013-07-12 01:28:42 +0000
+++ tools/charm_helpers_sync/README	1970-01-01 00:00:00 +0000
@@ -1,114 +0,0 @@
-Script for synchronizing charm-helpers into a charm branch.
-
-This script is intended to be used by charm authors during the development
-of their charm.  It allows authors to pull in bits of a charm-helpers source
-tree and embed directly into their charm, to be deployed with the rest of
-their hooks and charm payload.  This script is not intended to be called
-by the hooks themselves, but instead by the charm author while they are
-hacking on a charm offline.  Consider it a method of compiling specific
-revision of a charm-helpers branch into a given charm source tree.
-
-Some goals and benefits to using a sync tool to manage this process:
-
-    - Reduces the burden of manually copying in upstream charm helpers code
-      into a charm and helps ensure we can easily keep a specific charm's
-      helper code up to date.
-
-    - Allows authors to hack on their own working branch of charm-helpers,
-      easily sync into their WIP charm.  Any changes they've made to charm
-      helpers can be upstreamed via a merge of their charm-helpers branch
-      into lp:charm-helpers, ideally at the same time they are upstreaming
-      the charm itself into the charm store.  Separating charm helper
-      development from charm development can help reduce cases where charms
-      are shipping locally modified helpers.
-
-    - Avoids the need to ship the *entire* lp:charm-helpers source tree with
-      a charm.  Authors can selectively pick and choose what subset of helpers
-      to include to satisfy the goals of their charm.
-
-Loosely based on OpenStack's oslo-incubator:
-
-    https://github.com/openstack/oslo-incubator.git
-
-Allows specifying a list of dependencies to sync in from a charm-helpers
-branch.  Ideally, each charm should describe its requirements in a yaml
-config included in the charm, eg charm-helpers.yaml (NOTE: Example module
-layout as of 05/30/2013):
-
-    $ cd my-charm
-    $ cat >charm-helpers.yaml <<END
-    destination: hooks/helpers
-    branch: lp:charm-helpers
-    include:
-        - core
-        - contrib.openstack
-        - contrib.hahelpers:
-            - ceph_utils
-    END
-
-includes may be defined as entire module sub-directories, or as individual
-.py files within a module sub-directory.
-
-Charm author can then sync in and update helpers as needed.  The following
-import all of charmhelpers.core + charmhelpers.contrib.openstack, and only
-ceph_utils.py from charmhelpers.contrib.hahelpers:
-
-    $ charm-helper-sync -c charm-helpers.yaml
-    $ find hooks/helpers/
-    hooks/helpers/
-    hooks/helpers/contrib
-    hooks/helpers/contrib/openstack
-    hooks/helpers/contrib/openstack/openstack_utils.py
-    hooks/helpers/contrib/openstack/__init__.py
-    hooks/helpers/contrib/hahelpers
-    hooks/helpers/contrib/hahelpers/ceph_utils.py
-    hooks/helpers/contrib/hahelpers/__init__.py
-    hooks/helpers/contrib/__init__.py
-    hooks/helpers/core
-    hooks/helpers/core/hookenv.py
-    hooks/helpers/core/host.py
-    hooks/helpers/core/__init__.py
-    hooks/helpers/__init__.py
-
-
-Script will create missing __init__.py's to ensure each subdirectory is
-importable, assuming the script is run from the charm's top-level directory.
-
-By default, only directories that look like python modules and associated
-.py source files will be synced.  If you need to include other files in
-the sync (for example, template files), you can add include hints to
-your config.  This can be done either on a per-module basis using standard
-UNIX filename patterns, eg:
-
-    destination: hooks/helpers
-    branch: lp:charm-helpers
-    include:
-        - core|inc=* # include all extra files from this module.
-        - contrib.openstack|inc=*.template # include only .template's
-        - contrib.hahelpers:
-            - ceph_utils|inc=*.cfg # include .cfg files
-
-Or globally for all included assets:
-
-    destination: hooks/helpers
-    branch: lp:charm-helpers
-    options: inc=*.template,inc=*.cfg # include .templates and .cfgs globally
-    include:
-        - core
-        - contrib.openstack
-        - contrib.hahelpers:
-            - ceph_utils
-
-You may also override configured destination directory and source bzr
-branch:
-
-    $ charm-helper-sync -b ~/src/bzr/charm-helpers-dev \
-            -d hooks/helpers-test \
-            -c charm-helpers.yaml
-
-Or not use a config file at all:
-    $ charm-helper-sync -b lp:~gandelman-a/charm-helpers/fixes \
-            -d hooks/helpers core contrib.openstack contrib.hahelpers
-
-Script will create missing __init__.py's to ensure each subdirectory is
-importable, assuming the script is run from the charm's top-level directory.

=== removed file 'tools/charm_helpers_sync/__init__.py'
=== removed file 'tools/charm_helpers_sync/charm_helpers_sync.py'
--- tools/charm_helpers_sync/charm_helpers_sync.py	2013-07-12 01:28:42 +0000
+++ tools/charm_helpers_sync/charm_helpers_sync.py	1970-01-01 00:00:00 +0000
@@ -1,225 +0,0 @@
-#!/usr/bin/python
-#
-# Copyright 2013 Canonical Ltd.
-
-# Authors:
-#   Adam Gandelman <adamg@xxxxxxxxxx>
-#
-
-import logging
-import optparse
-import os
-import subprocess
-import shutil
-import sys
-import tempfile
-import yaml
-
-from fnmatch import fnmatch
-
-CHARM_HELPERS_BRANCH = 'lp:charm-helpers'
-
-
-def parse_config(conf_file):
-    if not os.path.isfile(conf_file):
-        logging.error('Invalid config file: %s.' % conf_file)
-        return False
-    return yaml.load(open(conf_file).read())
-
-
-def clone_helpers(work_dir, branch):
-    dest = os.path.join(work_dir, 'charm-helpers')
-    logging.info('Checking out %s to %s.' % (branch, dest))
-    cmd = ['bzr', 'branch', branch, dest]
-    subprocess.check_call(cmd)
-    return dest
-
-
-def _module_path(module):
-    return os.path.join(*module.split('.'))
-
-
-def _src_path(src, module):
-    return os.path.join(src, 'charmhelpers', _module_path(module))
-
-
-def _dest_path(dest, module):
-    return os.path.join(dest, _module_path(module))
-
-
-def _is_pyfile(path):
-    return os.path.isfile(path + '.py')
-
-
-def ensure_init(path):
-    '''
-    ensure directories leading up to path are importable, omitting
-    parent directory, eg path='/hooks/helpers/foo'/:
-        hooks/
-        hooks/helpers/__init__.py
-        hooks/helpers/foo/__init__.py
-    '''
-    for d, dirs, files in os.walk(os.path.join(*path.split('/')[:2])):
-        _i = os.path.join(d, '__init__.py')
-        if not os.path.exists(_i):
-            logging.info('Adding missing __init__.py: %s' % _i)
-            open(_i, 'wb').close()
-
-
-def sync_pyfile(src, dest):
-    src = src + '.py'
-    src_dir = os.path.dirname(src)
-    logging.info('Syncing pyfile: %s -> %s.' % (src, dest))
-    if not os.path.exists(dest):
-        os.makedirs(dest)
-    shutil.copy(src, dest)
-    if os.path.isfile(os.path.join(src_dir, '__init__.py')):
-        shutil.copy(os.path.join(src_dir, '__init__.py'),
-                    dest)
-    ensure_init(dest)
-
-
-def get_filter(opts=None):
-    opts = opts or []
-    if 'inc=*' in opts:
-        # do not filter any files, include everything
-        return None
-
-    def _filter(dir, ls):
-        incs = [opt.split('=').pop() for opt in opts if 'inc=' in opt]
-        _filter = []
-        for f in ls:
-            _f = os.path.join(dir, f)
-
-            if not os.path.isdir(_f) and not _f.endswith('.py') and incs:
-                if True not in [fnmatch(_f, inc) for inc in incs]:
-                    logging.debug('Not syncing %s, does not match include '
-                                  'filters (%s)' % (_f, incs))
-                    _filter.append(f)
-                else:
-                    logging.debug('Including file, which matches include '
-                                  'filters (%s): %s' % (incs, _f))
-            elif (os.path.isfile(_f) and not _f.endswith('.py')):
-                logging.debug('Not syncing file: %s' % f)
-                _filter.append(f)
-            elif (os.path.isdir(_f) and not
-                  os.path.isfile(os.path.join(_f, '__init__.py'))):
-                logging.debug('Not syncing directory: %s' % f)
-                _filter.append(f)
-        return _filter
-    return _filter
-
-
-def sync_directory(src, dest, opts=None):
-    if os.path.exists(dest):
-        logging.debug('Removing existing directory: %s' % dest)
-        shutil.rmtree(dest)
-    logging.info('Syncing directory: %s -> %s.' % (src, dest))
-
-    shutil.copytree(src, dest, ignore=get_filter(opts))
-    ensure_init(dest)
-
-
-def sync(src, dest, module, opts=None):
-    if os.path.isdir(_src_path(src, module)):
-        sync_directory(_src_path(src, module), _dest_path(dest, module), opts)
-    elif _is_pyfile(_src_path(src, module)):
-        sync_pyfile(_src_path(src, module),
-                    os.path.dirname(_dest_path(dest, module)))
-    else:
-        logging.warn('Could not sync: %s. Neither a pyfile or directory, '
-                     'does it even exist?' % module)
-
-
-def parse_sync_options(options):
-    if not options:
-        return []
-    return options.split(',')
-
-
-def extract_options(inc, global_options=None):
-    global_options = global_options or []
-    if global_options and isinstance(global_options, basestring):
-        global_options = [global_options]
-    if '|' not in inc:
-        return (inc, global_options)
-    inc, opts = inc.split('|')
-    return (inc, parse_sync_options(opts) + global_options)
-
-
-def sync_helpers(include, src, dest, options=None):
-    if not os.path.isdir(dest):
-        os.mkdir(dest)
-
-    global_options = parse_sync_options(options)
-
-    for inc in include:
-        if isinstance(inc, str):
-            inc, opts = extract_options(inc, global_options)
-            sync(src, dest, inc, opts)
-        elif isinstance(inc, dict):
-            # could also do nested dicts here.
-            for k, v in inc.iteritems():
-                if isinstance(v, list):
-                    for m in v:
-                        inc, opts = extract_options(m, global_options)
-                        sync(src, dest, '%s.%s' % (k, inc), opts)
-
-if __name__ == '__main__':
-    parser = optparse.OptionParser()
-    parser.add_option('-c', '--config', action='store', dest='config',
-                      default=None, help='helper config file')
-    parser.add_option('-D', '--debug', action='store_true', dest='debug',
-                      default=False, help='debug')
-    parser.add_option('-b', '--branch', action='store', dest='branch',
-                      help='charm-helpers bzr branch (overrides config)')
-    parser.add_option('-d', '--destination', action='store', dest='dest_dir',
-                      help='sync destination dir (overrides config)')
-    (opts, args) = parser.parse_args()
-
-    if opts.debug:
-        logging.basicConfig(level=logging.DEBUG)
-    else:
-        logging.basicConfig(level=logging.INFO)
-
-    if opts.config:
-        logging.info('Loading charm helper config from %s.' % opts.config)
-        config = parse_config(opts.config)
-        if not config:
-            logging.error('Could not parse config from %s.' % opts.config)
-            sys.exit(1)
-    else:
-        config = {}
-
-    if 'branch' not in config:
-        config['branch'] = CHARM_HELPERS_BRANCH
-    if opts.branch:
-        config['branch'] = opts.branch
-    if opts.dest_dir:
-        config['destination'] = opts.dest_dir
-
-    if 'destination' not in config:
-        logging.error('No destination dir. specified as option or config.')
-        sys.exit(1)
-
-    if 'include' not in config:
-        if not args:
-            logging.error('No modules to sync specified as option or config.')
-            sys.exit(1)
-        config['include'] = []
-        [config['include'].append(a) for a in args]
-
-    sync_options = None
-    if 'options' in config:
-        sync_options = config['options']
-    tmpd = tempfile.mkdtemp()
-    try:
-        checkout = clone_helpers(tmpd, config['branch'])
-        sync_helpers(config['include'], checkout, config['destination'],
-                     options=sync_options)
-    except Exception, e:
-        logging.error("Could not sync: %s" % e)
-        raise e
-    finally:
-        logging.debug('Cleaning up %s' % tmpd)
-        shutil.rmtree(tmpd)

=== removed file 'tools/charm_helpers_sync/example-config.yaml'
--- tools/charm_helpers_sync/example-config.yaml	2013-06-20 00:45:16 +0000
+++ tools/charm_helpers_sync/example-config.yaml	1970-01-01 00:00:00 +0000
@@ -1,14 +0,0 @@
-# Import from remote bzr branch.
-branch: lp:charm-helpers
-# install helpers to ./hooks/charmhelpers
-destination: hooks/charmhelpers
-include:
-    # include all of charmhelpers.core
-    - core
-    # all of charmhelpers.payload
-    - payload
-    # and a subset of charmhelpers.contrib.hahelpers
-    - contrib.hahelpers:
-        - openstack_common
-        - ceph_utils
-        - utils

=== added file 'tox.ini'
--- tox.ini	1970-01-01 00:00:00 +0000
+++ tox.ini	2014-06-12 20:35:48 +0000
@@ -0,0 +1,14 @@
+[tox]
+envlist = py27
+
+[testenv]
+install_command=pip install --pre {opts}
+   --allow-all-external
+   --allow-unverified launchpadlib
+   --allow-unverified python-apt
+   --allow-unverified bzr
+   --allow-unverified lazr.authentication
+   {packages}
+deps=-r{toxinidir}/test-requirements-tox.txt
+commands =
+         nosetests --nologcapture {posargs}

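The new tox configuration is what `make test` now drives; running tox directly also works, and anything after -- is handed to nosetests via {posargs}. For example (the fast-test filter mirrors the Makefile's ftest target):

    tox                                   # full suite in the py27 environment
    tox -- --attr '!slow' tests/core      # pass extra args through to nosetests
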
