launchpad-reviewers team mailing list archive

[Merge] lp:~jtv/maas/download-commissioning-script into lp:maas

 

Jeroen T. Vermeulen has proposed merging lp:~jtv/maas/download-commissioning-script into lp:maas.

Commit message:
Make commissioning nodes download their commissioning scripts (such as lshw) from the metadata service.  And componentize the big commissioning script.

Requested reviews:
  Launchpad code reviewers (launchpad-reviewers)

For more details, see:
https://code.launchpad.net/~jtv/maas/download-commissioning-script/+merge/140422

This is a huge branch.  The work was done in collaboration with Raphael, who worked his arse off in the server sewers to get this tested and debugged.

The metadata service provides a tarball of commissioning scripts: the lshw script, plus any owner-provided commissioning scripts.  In this branch the main commissioning script learns to download and unpack that tarball, instead of writing the scripts as "here documents."
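
To make the new flow concrete, here is a rough, purely illustrative Python sketch of what "download and unpack" amounts to.  The real node-side mechanism is the maas-get helper piped into tar, as the template diff below shows; the function name and URL parameter here are made up, and the OAuth signing is omitted.

    # Illustrative only -- names are hypothetical; the real flow is
    # "maas-get ... | tar -C ... -x" inside the commissioning script.
    import io
    import tarfile
    import urllib2  # the snippets target Python 2

    def fetch_commissioning_scripts(tarball_url, target_dir):
        # Fetch the tarball of commissioning scripts from the metadata
        # service (the real helper signs this request with OAuth).
        tarball = urllib2.urlopen(tarball_url).read()
        # Unpack it; it contains a commissioning.d directory of scripts.
        tarfile.open(fileobj=io.BytesIO(tarball)).extractall(target_dir)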

Ironically, some of those "here documents" are still needed, and they got out of hand.  During development and testing we hit problems that "make lint" would have caught if only the Python weren't embedded in shell script; other problems would have stood out in unit tests, had the code been organized in a testable way; and yet others really did require manual testing.

And so this branch also splits up the big commissioning script, generating it from a Tempita template (src/metadataserver/commissioning/user_data.template) and a directory of snippets (src/metadataserver/commissioning/snippets/*) that get substituted into the template.  This normalizes the way we maintain this well-hidden Python code.
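
For reference, a minimal sketch of the substitution (the template text and snippet content here are invented; the real code is in user_data.py and user_data.template further down):

    # Minimal sketch: each snippet file becomes a template variable whose
    # value is the file's contents, substituted into the main script.
    import tempita

    template = tempita.Template(
        'add_bin "maas-get" <<"END_MAAS_GET"\n'
        '{{maas_get_py}}\n'
        'END_MAAS_GET\n')
    snippets = {'maas_get_py': '#!/usr/bin/python\nprint "stub"\n'}
    print(template.substitute(snippets))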

Do not confuse these snippets with what's in the commissioning tarball: the snippets are just components that are needed to enable the download of that tarball.

Only one snippet is properly new code: maas_get.py.  I extracted a good portion of the old maas-signal into maas_api_helper.py so that maas_get.py can re-use it.

Two snippets (maas_get.py and maas_signal.py) have different filenames in the codebase from the ones they end up with on the node (maas-get and maas-signal, respectively).  On the one hand, changing their names on the node might have upset existing habits, documentation, and scripting.  On the other hand, their names in the source tree had to make clear to our code maintenance tools that they were Python.  The templating makes it easy to keep the two filenames separate.
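
To illustrate the mapping (strip_name is the real helper from user_data.py below; the rest just restates the template fragment):

    # Source filename -> template variable name (dots become underscores).
    def strip_name(snippet_name):
        return snippet_name.replace('.', '_')

    assert strip_name('maas_signal.py') == 'maas_signal_py'
    # The template's heredoc then chooses the on-node filename:
    #   add_bin "maas-signal" <<"END_MAAS_SIGNAL"
    #   {{maas_signal_py}}
    #   END_MAAS_SIGNAL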

You may wonder why I left the IPMI code and its configs in the template, instead of splitting those out as well.  The reason (besides this branch already being big enough!) is that those should probably become part of the downloaded tarball, not remain in the componentized main commissioning script.  The change would be easy, and make a lot of sense.  It would also reduce the size of the main commissioning script: I seem to remember hearing it should be no more than 16KB(*) and it's more than that already.

Jeroen

(*) That is Kilobytes, not Kibibytes.  There is a lot of confusion over this, because many people are unaware that a proper imperial byte consists of 8.192 bits.  They generally prefer to use the simpler, 8-bit "metric byte" instead.  A kilobyte is exactly 1,000 imperial 8.192-bit bytes, just as a kilometer is exactly 1,000 meters.  But due to the rounding error in conversion to metric bytes, a kilobyte needs 24 "leap bytes."  This raises yet more confusion over whether these leap bytes are metric or imperial bytes, and that is why there is so much miscommunication over gigabytes and terabytes in the hard-drive industry.  Are we all clear now?
-- 
https://code.launchpad.net/~jtv/maas/download-commissioning-script/+merge/140422
Your team Launchpad code reviewers is requested to review the proposed merge of lp:~jtv/maas/download-commissioning-script into lp:maas.
=== modified file 'src/maas/development.py'
--- src/maas/development.py	2012-09-19 13:48:42 +0000
+++ src/maas/development.py	2012-12-18 13:00:43 +0000
@@ -100,10 +100,6 @@
 DEV_ROOT_DIRECTORY = os.path.join(
     os.path.dirname(__file__), os.pardir, os.pardir)
 
-
-COMMISSIONING_SCRIPT = os.path.join(
-    DEV_ROOT_DIRECTORY, 'etc/maas/commissioning-user-data')
-
 # Override the default provisioning config filename.
 provisioningserver.config.Config.DEFAULT_FILENAME = abspath("etc/pserv.yaml")
 

=== modified file 'src/maas/settings.py'
--- src/maas/settings.py	2012-11-05 10:59:38 +0000
+++ src/maas/settings.py	2012-12-18 13:00:43 +0000
@@ -284,11 +284,6 @@
     'version': 1,
 }
 
-# The location of the commissioning script that is executed on nodes as
-# part of commissioning.  Only override this if you know what you are
-# doing.
-COMMISSIONING_SCRIPT = '/etc/maas/commissioning-user-data'
-
 # The duration, in minutes, after which we consider a commissioning node
 # to have failed and mark it as FAILED_TESTS.
 COMMISSIONING_TIMEOUT = 60

=== modified file 'src/maasserver/models/node.py'
--- src/maasserver/models/node.py	2012-11-08 10:05:27 +0000
+++ src/maasserver/models/node.py	2012-12-18 13:00:43 +0000
@@ -24,12 +24,10 @@
     repeat,
     )
 import math
-import os
 import random
 from string import whitespace
 from uuid import uuid1
 
-from django.conf import settings
 from django.contrib.auth.models import User
 from django.core.exceptions import (
     PermissionDenied,
@@ -634,15 +632,10 @@
     def start_commissioning(self, user):
         """Install OS and self-test a new node."""
         # Avoid circular imports.
+        from metadataserver.commissioning.user_data import generate_user_data
         from metadataserver.models import NodeCommissionResult
 
-        path = settings.COMMISSIONING_SCRIPT
-        if not os.path.exists(path):
-            raise ValidationError(
-                "Commissioning script is missing: %s" % path)
-        with open(path, 'r') as f:
-            commissioning_user_data = f.read()
-
+        commissioning_user_data = generate_user_data()
         NodeCommissionResult.objects.clear_results(self)
         self.status = NODE_STATUS.COMMISSIONING
         self.owner = user

=== modified file 'src/maasserver/tests/test_node.py'
--- src/maasserver/tests/test_node.py	2012-12-12 11:25:17 +0000
+++ src/maasserver/tests/test_node.py	2012-12-18 13:00:43 +0000
@@ -15,7 +15,6 @@
 from datetime import timedelta
 import random
 
-from django.conf import settings
 from django.core.exceptions import (
     PermissionDenied,
     ValidationError,
@@ -50,6 +49,7 @@
     map_enum,
     )
 from maastesting.testcase import TestCase as DjangoLessTestCase
+from metadataserver import commissioning
 from metadataserver.enum import COMMISSIONING_STATUS
 from metadataserver.models import (
     NodeCommissionResult,
@@ -61,7 +61,6 @@
     AllMatch,
     Contains,
     Equals,
-    FileContains,
     MatchesAll,
     MatchesListwise,
     Not,
@@ -485,19 +484,12 @@
 
     def test_start_commissioning_sets_user_data(self):
         node = factory.make_node(status=NODE_STATUS.DECLARED)
+        user_data = factory.getRandomString().encode('ascii')
+        self.patch(
+            commissioning.user_data, 'generate_user_data'
+            ).return_value = user_data
         node.start_commissioning(factory.make_admin())
-        path = settings.COMMISSIONING_SCRIPT
-        self.assertThat(
-            path, FileContains(NodeUserData.objects.get_user_data(node)))
-
-    def test_missing_commissioning_script(self):
-        self.patch(
-            settings, 'COMMISSIONING_SCRIPT',
-            '/etc/' + factory.getRandomString(10))
-        node = factory.make_node(status=NODE_STATUS.DECLARED)
-        self.assertRaises(
-            ValidationError,
-            node.start_commissioning, factory.make_admin())
+        self.assertEqual(user_data, NodeUserData.objects.get_user_data(node))
 
     def test_start_commissioning_clears_node_commissioning_results(self):
         node = factory.make_node(status=NODE_STATUS.DECLARED)

=== added directory 'src/metadataserver/commissioning'
=== added file 'src/metadataserver/commissioning/__init__.py'
=== added directory 'src/metadataserver/commissioning/snippets'
=== added file 'src/metadataserver/commissioning/snippets/__init__.py'
=== added file 'src/metadataserver/commissioning/snippets/maas_api_helper.py'
--- src/metadataserver/commissioning/snippets/maas_api_helper.py	1970-01-01 00:00:00 +0000
+++ src/metadataserver/commissioning/snippets/maas_api_helper.py	2012-12-18 13:00:43 +0000
@@ -0,0 +1,114 @@
+from email.utils import parsedate
+import sys
+import time
+import urllib2
+
+import oauth.oauth as oauth
+import yaml
+
+
+__all__ = [
+    'geturl',
+    'read_config',
+    ]
+
+
+def read_config(url, creds):
+    """Read cloud-init config from given `url` into `creds` dict.
+
+    Updates any keys in `creds` that are None with their corresponding
+    values in the config.
+
+    Important keys include `metadata_url`, and the actual OAuth
+    credentials.
+    """
+    if url.startswith("http://") or url.startswith("https://"):
+        cfg_str = urllib2.urlopen(urllib2.Request(url=url))
+    else:
+        if url.startswith("file://"):
+            url = url[7:]
+        cfg_str = open(url, "r").read()
+
+    cfg = yaml.safe_load(cfg_str)
+
+    # Support reading cloud-init config for MAAS datasource.
+    if 'datasource' in cfg:
+        cfg = cfg['datasource']['MAAS']
+
+    for key in creds.keys():
+        if key in cfg and creds[key] == None:
+            creds[key] = cfg[key]
+
+
+def oauth_headers(url, consumer_key, token_key, token_secret, consumer_secret,
+                  clockskew=0):
+    """Build OAuth headers using given credentials."""
+    consumer = oauth.OAuthConsumer(consumer_key, consumer_secret)
+    token = oauth.OAuthToken(token_key, token_secret)
+
+    timestamp = int(time.time()) + clockskew
+
+    params = {
+        'oauth_version': "1.0",
+        'oauth_nonce': oauth.generate_nonce(),
+        'oauth_timestamp': timestamp,
+        'oauth_token': token.key,
+        'oauth_consumer_key': consumer.key,
+    }
+    req = oauth.OAuthRequest(http_url=url, parameters=params)
+    req.sign_request(
+        oauth.OAuthSignatureMethod_PLAINTEXT(), consumer, token)
+    return(req.to_header())
+
+
+def authenticate_headers(url, headers, creds, clockskew):
+    """Update and sign a dict of request headers."""
+    if creds.get('consumer_key', None) != None:
+        headers.update(oauth_headers(
+            url,
+            consumer_key=creds['consumer_key'],
+            token_key=creds['token_key'],
+            token_secret=creds['token_secret'],
+            consumer_secret=creds['consumer_secret'],
+            clockskew=clockskew))
+
+
+def warn(msg):
+    sys.stderr.write(msg + "\n")
+
+
+def geturl(url, creds, headers=None, data=None):
+    # Takes a dict of creds to be passed through to oauth_headers,
+    #   so it should have consumer_key, token_key, ...
+    if headers is None:
+        headers = {}
+    else:
+        headers = dict(headers)
+
+    clockskew = 0
+
+    exc = Exception("Unexpected Error")
+    for naptime in (1, 1, 2, 4, 8, 16, 32):
+        authenticate_headers(url, headers, creds, clockskew)
+        try:
+            req = urllib2.Request(url=url, data=data, headers=headers)
+            return urllib2.urlopen(req).read()
+        except urllib2.HTTPError as exc:
+            if 'date' not in exc.headers:
+                warn("date field not in %d headers" % exc.code)
+                pass
+            elif exc.code in (401, 403):
+                date = exc.headers['date']
+                try:
+                    ret_time = time.mktime(parsedate(date))
+                    clockskew = int(ret_time - time.time())
+                    warn("updated clock skew to %d" % clockskew)
+                except:
+                    warn("failed to convert date '%s'" % date)
+        except Exception as exc:
+            pass
+
+        warn("request to %s failed. sleeping %d.: %s" % (url, naptime, exc))
+        time.sleep(naptime)
+
+    raise exc

=== added file 'src/metadataserver/commissioning/snippets/maas_get.py'
--- src/metadataserver/commissioning/snippets/maas_get.py	1970-01-01 00:00:00 +0000
+++ src/metadataserver/commissioning/snippets/maas_get.py	2012-12-18 13:00:43 +0000
@@ -0,0 +1,46 @@
+#!/usr/bin/python
+
+import sys
+
+from maas_api_helper import (
+    geturl,
+    read_config,
+    )
+
+
+MD_VERSION = "2012-03-01"
+
+
+def main():
+    """Authenticate, and download file from MAAS metadata API."""
+    import argparse
+
+    parser = argparse.ArgumentParser(
+        description="GET file from MAAS metadata API.")
+    parser.add_argument(
+        "--config", metavar="file",
+        help="Config file containing MAAS API credentials", default=None)
+    parser.add_argument("--apiver", metavar="version",
+        help="Use given API version", default=MD_VERSION)
+    parser.add_argument('path')
+
+    args = parser.parse_args()
+
+    creds = {
+        'consumer_key': None,
+        'token_key': None,
+        'token_secret': None,
+        'consumer_secret': None,
+        'metadata_url': None,
+    }
+    read_config(args.config, creds)
+    url = "%s/%s/%s" % (
+        creds['metadata_url'],
+        args.apiver,
+        args.path,
+        )
+
+    sys.stdout.write(geturl(url, creds))
+
+if __name__ == '__main__':
+    main()

=== added file 'src/metadataserver/commissioning/snippets/maas_signal.py'
--- src/metadataserver/commissioning/snippets/maas_signal.py	1970-01-01 00:00:00 +0000
+++ src/metadataserver/commissioning/snippets/maas_signal.py	2012-12-18 13:00:43 +0000
@@ -0,0 +1,184 @@
+#!/usr/bin/python
+
+import json
+import mimetypes
+import os.path
+import random
+import socket
+import string
+import sys
+import urllib2
+
+from maas_api_helper import (
+    geturl,
+    read_config,
+    )
+
+
+MD_VERSION = "2012-03-01"
+VALID_STATUS = ("OK", "FAILED", "WORKING")
+POWER_TYPES = ("ipmi", "virsh", "ether_wake")
+
+
+def _encode_field(field_name, data, boundary):
+    return ('--' + boundary,
+            'Content-Disposition: form-data; name="%s"' % field_name,
+            '', str(data))
+
+
+def _encode_file(name, fileObj, boundary):
+    return ('--' + boundary,
+            'Content-Disposition: form-data; name="%s"; filename="%s"' %
+                (name, name),
+            'Content-Type: %s' % _get_content_type(name),
+            '', fileObj.read())
+
+
+def _random_string(length):
+    return ''.join(random.choice(string.letters) for ii in range(length + 1))
+
+
+def _get_content_type(filename):
+    return mimetypes.guess_type(filename)[0] or 'application/octet-stream'
+
+
+def encode_multipart_data(data, files):
+    """Create a MIME multipart payload from L{data} and L{files}.
+
+    @param data: A mapping of names (ASCII strings) to data (byte string).
+    @param files: A mapping of names (ASCII strings) to file objects ready to
+        be read.
+    @return: A 2-tuple of C{(body, headers)}, where C{body} is a a byte string
+        and C{headers} is a dict of headers to add to the enclosing request in
+        which this payload will travel.
+    """
+    boundary = _random_string(30)
+
+    lines = []
+    for name in data:
+        lines.extend(_encode_field(name, data[name], boundary))
+    for name in files:
+        lines.extend(_encode_file(name, files[name], boundary))
+    lines.extend(('--%s--' % boundary, ''))
+    body = '\r\n'.join(lines)
+
+    headers = {'content-type': 'multipart/form-data; boundary=' + boundary,
+               'content-length': str(len(body))}
+
+    return body, headers
+
+
+def fail(msg):
+    sys.stderr.write("FAIL: %s" % msg)
+    sys.exit(1)
+
+
+def main():
+    """
+    Call with single argument of directory or http or https url.
+    If url is given additional arguments are allowed, which will be
+    interpreted as consumer_key, token_key, token_secret, consumer_secret.
+    """
+    import argparse
+
+    parser = argparse.ArgumentParser(
+        description='Send signal operation and optionally post files to MAAS')
+    parser.add_argument("--config", metavar="file",
+        help="Specify config file", default=None)
+    parser.add_argument("--ckey", metavar="key",
+        help="The consumer key to auth with", default=None)
+    parser.add_argument("--tkey", metavar="key",
+        help="The token key to auth with", default=None)
+    parser.add_argument("--csec", metavar="secret",
+        help="The consumer secret (likely '')", default="")
+    parser.add_argument("--tsec", metavar="secret",
+        help="The token secret to auth with", default=None)
+    parser.add_argument("--apiver", metavar="version",
+        help="The apiver to use ("" can be used)", default=MD_VERSION)
+    parser.add_argument("--url", metavar="url",
+        help="The data source to query", default=None)
+    parser.add_argument("--file", dest='files',
+        help="File to post", action='append', default=[])
+    parser.add_argument("--post", dest='posts',
+        help="name=value pairs to post", action='append', default=[])
+    parser.add_argument("--power-type", dest='power_type',
+        help="Power type.", choices=POWER_TYPES, default=None)
+    parser.add_argument("--power-parameters", dest='power_parms',
+        help="Power parameters.", default=None)
+
+    parser.add_argument("status",
+        help="Status", choices=VALID_STATUS, action='store')
+    parser.add_argument("message", help="Optional message",
+        default="", nargs='?')
+
+    args = parser.parse_args()
+
+    creds = {
+        'consumer_key': args.ckey,
+        'token_key': args.tkey,
+        'token_secret': args.tsec,
+        'consumer_secret': args.csec,
+        'metadata_url': args.url,
+        }
+
+    if args.config:
+        read_config(args.config, creds)
+
+    url = creds.get('metadata_url', None)
+    if not url:
+        fail("URL must be provided either in --url or in config\n")
+    url = "%s/%s/" % (url, args.apiver)
+
+    params = {
+        "op": "signal",
+        "status": args.status,
+        "error": args.message}
+
+    for ent in args.posts:
+        try:
+            (key, val) = ent.split("=", 2)
+        except ValueError:
+            sys.stderr.write("'%s' had no '='" % ent)
+            sys.exit(1)
+        params[key] = val
+
+    if args.power_parms is not None:
+        params["power_type"] = args.power_type
+        power_parms = dict(
+            power_user=args.power_parms.split(",")[0],
+            power_pass=args.power_parms.split(",")[1],
+            power_address=args.power_parms.split(",")[2]
+            )
+        params["power_parameters"] = json.dumps(power_parms)
+
+    files = {}
+    for fpath in args.files:
+        files[os.path.basename(fpath)] = open(fpath, "r")
+
+    data, headers = encode_multipart_data(params, files)
+
+    exc = None
+    msg = ""
+
+    try:
+        payload = geturl(url, creds=creds, headers=headers, data=data)
+        if payload != "OK":
+            raise TypeError("Unexpected result from call: %s" % payload)
+        else:
+            msg = "Success"
+    except urllib2.HTTPError as exc:
+        msg = "http error [%s]" % exc.code
+    except urllib2.URLError as exc:
+        msg = "url error [%s]" % exc.reason
+    except socket.timeout as exc:
+        msg = "socket timeout [%s]" % exc
+    except TypeError as exc:
+        msg = exc.message
+    except Exception as exc:
+        msg = "unexpected error [%s]" % exc
+
+    sys.stderr.write("%s\n" % msg)
+    sys.exit((exc is None))
+
+if __name__ == '__main__':
+    main()

=== added directory 'src/metadataserver/commissioning/tests'
=== added file 'src/metadataserver/commissioning/tests/__init__.py'
=== added file 'src/metadataserver/commissioning/tests/test_user_data.py'
--- src/metadataserver/commissioning/tests/test_user_data.py	1970-01-01 00:00:00 +0000
+++ src/metadataserver/commissioning/tests/test_user_data.py	2012-12-18 13:00:43 +0000
@@ -0,0 +1,74 @@
+# Copyright 2012 Canonical Ltd.  This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+"""Test generation of commissioning user data."""
+
+from __future__ import (
+    absolute_import,
+    print_function,
+    unicode_literals,
+    )
+
+__metaclass__ = type
+__all__ = []
+
+import os.path
+
+from maasserver.testing.factory import factory
+from maasserver.testing.testcase import TestCase
+from maastesting.matchers import ContainsAll
+from metadataserver.commissioning.user_data import (
+    generate_user_data,
+    is_snippet,
+    list_snippets,
+    read_snippet,
+    strip_name,
+    )
+
+
+class TestUserData(TestCase):
+
+    def test_read_snippet_reads_snippet_file(self):
+        contents = factory.getRandomString()
+        snippet = self.make_file(contents=contents)
+        self.assertEqual(
+            contents,
+            read_snippet(os.path.dirname(snippet), os.path.basename(snippet)))
+
+    def test_strip_name_leaves_simple_names_intact(self):
+        simple_name = factory.getRandomString()
+        self.assertEqual(simple_name, strip_name(simple_name))
+
+    def test_strip_name_replaces_dots(self):
+        self.assertEqual('_x_y_', strip_name('.x.y.'))
+
+    def test_is_snippet(self):
+        are_snippets = {
+            'snippet': True,
+            'with-dash': True,
+            'module.py': True,
+            '.backup': False,
+            'backup~': False,
+            'module.pyc': False,
+            '__init__.pyc': False,
+        }
+        self.assertEqual(
+            are_snippets,
+            {name: is_snippet(name) for name in are_snippets})
+
+    def test_list_snippets(self):
+        snippets_dir = self.make_dir()
+        factory.make_file(snippets_dir, 'snippet')
+        factory.make_file(snippets_dir, '.backup.pyc')
+        self.assertItemsEqual(['snippet'], list_snippets(snippets_dir))
+
+    def test_generate_user_data_produces_commissioning_script(self):
+        # generate_user_data produces a commissioning script which contains
+        # both definitions and use of various commands in python.
+        self.assertThat(
+            generate_user_data(), ContainsAll({
+                'maas-get',
+                'maas-signal',
+                'def authenticate_headers',
+                'def encode_multipart_data',
+            }))

=== added file 'src/metadataserver/commissioning/user_data.py'
--- src/metadataserver/commissioning/user_data.py	1970-01-01 00:00:00 +0000
+++ src/metadataserver/commissioning/user_data.py	2012-12-18 13:00:43 +0000
@@ -0,0 +1,81 @@
+# Copyright 2012 Canonical Ltd.  This software is licensed under the
+# GNU Affero General Public License version 3 (see the file LICENSE).
+
+"""Generate commissioning user-data from template and code snippets.
+
+This combines the `user_data.template` and the snippets of code in the
+`snippets` directory into the main commissioning script.
+
+Its contents are not customizable.  To inject custom code, use the
+:class:`CommissioningScript` model.
+"""
+
+from __future__ import (
+    absolute_import,
+    print_function,
+    unicode_literals,
+    )
+
+__metaclass__ = type
+__all__ = [
+    'generate_user_data',
+    ]
+
+from os import listdir
+import os.path
+
+import tempita
+
+
+def read_snippet(snippets_dir, name, encoding='utf-8'):
+    """Read a snippet file.
+
+    :rtype: `unicode`
+    """
+    path = os.path.join(snippets_dir, name)
+    with open(path, 'rb') as snippet_file:
+        return snippet_file.read().decode(encoding)
+
+
+def is_snippet(filename):
+    """Does `filename` represent a valid snippet name?"""
+    return all([
+        not filename.startswith('.'),
+        filename != '__init__.py',
+        not filename.endswith('.pyc'),
+        not filename.endswith('~'),
+        ])
+
+
+def list_snippets(snippets_dir):
+    """List names of available snippets."""
+    return filter(is_snippet, listdir(snippets_dir))
+
+
+def strip_name(snippet_name):
+    """Canonicalize a snippet name."""
+    # Dot suffixes do not work well in tempita variable names.
+    return snippet_name.replace('.', '_')
+
+
+def generate_user_data():
+    """Produce the main commissioning script.
+
+    The script was templated so that code snippets become easier to
+    maintain, check for lint, and ideally, unit-test.  However its
+    contents are static: there are no variables.  It's perfectly
+    cacheable.
+
+    :rtype: `bytes`
+    """
+    encoding = 'utf-8'
+    commissioning_dir = os.path.dirname(__file__)
+    template_file = os.path.join(commissioning_dir, 'user_data.template')
+    snippets_dir = os.path.join(commissioning_dir, 'snippets')
+    template = tempita.Template.from_filename(template_file, encoding=encoding)
+
+    snippets = {
+        strip_name(name): read_snippet(snippets_dir, name, encoding=encoding)
+        for name in list_snippets(snippets_dir)
+    }
+    return template.substitute(snippets).encode('utf-8')

=== renamed file 'etc/maas/commissioning-user-data' => 'src/metadataserver/commissioning/user_data.template'
--- etc/maas/commissioning-user-data	2012-12-13 07:01:12 +0000
+++ src/metadataserver/commissioning/user_data.template	2012-12-18 13:00:43 +0000
@@ -1,10 +1,12 @@
 #!/bin/sh
 #
 # This script carries inside it multiple files.  When executed, it creates
-# the files into a temporary directory, and then calls the 'main' function.
+# the files into a temporary directory, downloads and extracts commissioning
+# scripts from the metadata service, and then processes the scripts.
 #
-# main does a run-parts of all "scripts" and then calls home to maas with
-# maas-signal, posting output of each of the files added with add_script().
+# The commissioning scripts get run by a close equivalent of run-parts.
+# For each, the main script calls home to maas with maas-signal, posting
+# the script's output as a separate file.
 #
 ####  IPMI setup  ######
 # If IPMI network settings have been configured statically, you can
@@ -19,7 +21,7 @@
 
 #### script setup ######
 TEMP_D=$(mktemp -d "${TMPDIR:-/tmp}/${0##*/}.XXXXXX")
-SCRIPTS_D="${TEMP_D}/scripts"
+SCRIPTS_D="${TEMP_D}/commissioning.d"
 IPMI_CONFIG_D="${TEMP_D}/ipmi.d"
 BIN_D="${TEMP_D}/bin"
 OUT_D="${TEMP_D}/out"
@@ -33,18 +35,10 @@
    DEBIAN_FRONTEND=noninteractive apt-get --assume-yes -q "$@" </dev/null
 }
 
-writefile() {
-   cat > "$1"
-   chmod "$2" "$1"
-}
 add_bin() {
    cat > "${BIN_D}/$1"
    chmod "${2:-755}" "${BIN_D}/$1"
 }
-add_script() {
-   cat > "${SCRIPTS_D}/$1"
-   chmod "${2:-755}" "${SCRIPTS_D}/$1"
-}
 add_ipmi_config() {
    cat > "${IPMI_CONFIG_D}/$1"
    chmod "${2:-644}" "${IPMI_CONFIG_D}/$1"
@@ -145,6 +139,10 @@
       signal "--power-type=ipmi" "--power-parameters=${power_settings}" WORKING "finished [maas-ipmi-autodetect]"
    fi
 
+   # Download and extract commissioning scripts.  It will contain a
+   # commissioning.d directory, so this is how $SCRIPTS_D is created.
+   maas-get --config="${CRED_CFG}" maas-commissioning-scripts | tar -C "${TEMP_D}" -x
+
    # Just get a count of how many scripts there are for progress reporting.
    for script in "${SCRIPTS_D}/"*; do
       [ -x "$script" -a -f "$script" ] || continue
@@ -193,10 +191,6 @@
 }
 
 ### begin writing files ###
-add_script "00-maas-01-lshw" <<"END_LSHW"
-#!/bin/sh
-lshw -xml
-END_LSHW
 
 add_ipmi_config "01-user-privileges.ipmi" <<"END_IPMI_USER_PRIVILEGES"
 Section User3
@@ -330,270 +324,18 @@
     main()
 END_MAAS_IPMI_AUTODETECT
 
+add_bin "maas_api_helper.py" <<"END_MAAS_API_HELPER"
+{{maas_api_helper_py}}
+END_MAAS_API_HELPER
+
 add_bin "maas-signal" <<"END_MAAS_SIGNAL"
-#!/usr/bin/python
-
-from email.utils import parsedate
-import mimetypes
-import oauth.oauth as oauth
-import os.path
-import random
-import string
-import sys
-import time
-import urllib2
-import yaml
-import json
-
-MD_VERSION = "2012-03-01"
-VALID_STATUS = ("OK", "FAILED", "WORKING")
-POWER_TYPES = ("ipmi", "virsh", "ether_wake")
-
-
-def _encode_field(field_name, data, boundary):
-    return ('--' + boundary,
-            'Content-Disposition: form-data; name="%s"' % field_name,
-            '', str(data))
-
-
-def _encode_file(name, fileObj, boundary):
-    return ('--' + boundary,
-            'Content-Disposition: form-data; name="%s"; filename="%s"' %
-                (name, name),
-            'Content-Type: %s' % _get_content_type(name),
-            '', fileObj.read())
-
-
-def _random_string(length):
-    return ''.join(random.choice(string.letters) for ii in range(length + 1))
-
-
-def _get_content_type(filename):
-    return mimetypes.guess_type(filename)[0] or 'application/octet-stream'
-
-
-def encode_multipart_data(data, files):
-    """Create a MIME multipart payload from L{data} and L{files}.
-
-    @param data: A mapping of names (ASCII strings) to data (byte string).
-    @param files: A mapping of names (ASCII strings) to file objects ready to
-        be read.
-    @return: A 2-tuple of C{(body, headers)}, where C{body} is a a byte string
-        and C{headers} is a dict of headers to add to the enclosing request in
-        which this payload will travel.
-    """
-    boundary = _random_string(30)
-
-    lines = []
-    for name in data:
-        lines.extend(_encode_field(name, data[name], boundary))
-    for name in files:
-        lines.extend(_encode_file(name, files[name], boundary))
-    lines.extend(('--%s--' % boundary, ''))
-    body = '\r\n'.join(lines)
-
-    headers = {'content-type': 'multipart/form-data; boundary=' + boundary,
-               'content-length': str(len(body))}
-
-    return body, headers
-
-
-def oauth_headers(url, consumer_key, token_key, token_secret, consumer_secret,
-                  clockskew=0):
-    consumer = oauth.OAuthConsumer(consumer_key, consumer_secret)
-    token = oauth.OAuthToken(token_key, token_secret)
-
-    timestamp = int(time.time()) + clockskew
-
-    params = {
-        'oauth_version': "1.0",
-        'oauth_nonce': oauth.generate_nonce(),
-        'oauth_timestamp': timestamp,
-        'oauth_token': token.key,
-        'oauth_consumer_key': consumer.key,
-    }
-    req = oauth.OAuthRequest(http_url=url, parameters=params)
-    req.sign_request(oauth.OAuthSignatureMethod_PLAINTEXT(),
-        consumer, token)
-    return(req.to_header())
-
-
-def geturl(url, creds, headers=None, data=None):
-    # Takes a dict of creds to be passed through to oauth_headers,
-    #   so it should have consumer_key, token_key, ...
-    if headers is None:
-        headers = {}
-    else:
-        headers = dict(headers)
-
-    clockskew = 0
-
-    def warn(msg):
-        sys.stderr.write(msg + "\n")
-
-    exc = Exception("Unexpected Error")
-    for naptime in (1, 1, 2, 4, 8, 16, 32):
-        if creds.get('consumer_key', None) != None:
-            headers.update(oauth_headers(url,
-                consumer_key=creds['consumer_key'],
-                token_key=creds['token_key'],
-                token_secret=creds['token_secret'],
-                consumer_secret=creds['consumer_secret'],
-                clockskew=clockskew))
-        try:
-            req = urllib2.Request(url=url, data=data, headers=headers)
-            return(urllib2.urlopen(req).read())
-        except urllib2.HTTPError as exc:
-            if 'date' not in exc.headers:
-                warn("date field not in %d headers" % exc.code)
-                pass
-            elif (exc.code == 401 or exc.code == 403):
-                date = exc.headers['date']
-                try:
-                    ret_time = time.mktime(parsedate(date))
-                    clockskew = int(ret_time - time.time())
-                    warn("updated clock skew to %d" % clockskew)
-                except:
-                    warn("failed to convert date '%s'" % date)
-        except Exception as exc:
-            pass
-
-        warn("request to %s failed. sleeping %d.: %s" % (url, naptime, exc))
-        time.sleep(naptime)
-
-    raise exc
-
-
-def read_config(url, creds):
-    if url.startswith("http://") or url.startswith("https://"):
-        cfg_str = urllib2.urlopen(urllib2.Request(url=url))
-    else:
-        if url.startswith("file://"):
-            url = url[7:]
-        cfg_str = open(url,"r").read()
-
-    cfg = yaml.safe_load(cfg_str)
-
-    # Support reading cloud-init config for MAAS datasource.
-    if 'datasource' in cfg:
-        cfg = cfg['datasource']['MAAS']
-
-    for key in creds.keys():
-        if key in cfg and creds[key] == None:
-            creds[key] = cfg[key]
-
-def fail(msg):
-    sys.stderr.write("FAIL: %s" % msg)
-    sys.exit(1)
-
-
-def main():
-    """
-    Call with single argument of directory or http or https url.
-    If url is given additional arguments are allowed, which will be
-    interpreted as consumer_key, token_key, token_secret, consumer_secret.
-    """
-    import argparse
-    import pprint
-
-    parser = argparse.ArgumentParser(
-        description='Send signal operation and optionally post files to MAAS')
-    parser.add_argument("--config", metavar="file",
-        help="Specify config file", default=None)
-    parser.add_argument("--ckey", metavar="key",
-        help="The consumer key to auth with", default=None)
-    parser.add_argument("--tkey", metavar="key",
-        help="The token key to auth with", default=None)
-    parser.add_argument("--csec", metavar="secret",
-        help="The consumer secret (likely '')", default="")
-    parser.add_argument("--tsec", metavar="secret",
-        help="The token secret to auth with", default=None)
-    parser.add_argument("--apiver", metavar="version",
-        help="The apiver to use ("" can be used)", default=MD_VERSION)
-    parser.add_argument("--url", metavar="url",
-        help="The data source to query", default=None)
-    parser.add_argument("--file", dest='files',
-        help="File to post", action='append', default=[])
-    parser.add_argument("--post", dest='posts',
-        help="name=value pairs to post", action='append', default=[])
-    parser.add_argument("--power-type", dest='power_type',
-        help="Power type.", choices=POWER_TYPES, default=None)
-    parser.add_argument("--power-parameters", dest='power_parms',
-        help="Power parameters.", default=None)
-
-    parser.add_argument("status",
-        help="Status", choices=VALID_STATUS, action='store')
-    parser.add_argument("message", help="Optional message",
-        default="", nargs='?')
-
-    args = parser.parse_args()
-
-    creds = {'consumer_key': args.ckey, 'token_key': args.tkey,
-        'token_secret': args.tsec, 'consumer_secret': args.csec,
-        'metadata_url': args.url}
-
-    if args.config:
-        read_config(args.config, creds)
-
-    url = creds.get('metadata_url', None)
-    if not url:
-        fail("URL must be provided either in --url or in config\n")
-    url = "%s/%s/" % (url, args.apiver)
-
-    params = {
-        "op": "signal",
-        "status": args.status,
-        "error": args.message}
-
-    for ent in args.posts:
-        try:
-           (key, val) = ent.split("=", 2)
-        except ValueError:
-           sys.stderr.write("'%s' had no '='" % ent)
-           sys.exit(1)
-        params[key] = val
-
-    if args.power_parms is not None:
-        params["power_type"] = args.power_type
-        power_parms = dict(
-            power_user=args.power_parms.split(",")[0],
-            power_pass=args.power_parms.split(",")[1],
-            power_address=args.power_parms.split(",")[2]
-            )
-        params["power_parameters"] = json.dumps(power_parms)
-
-    files = {}
-    for fpath in args.files:
-        files[os.path.basename(fpath)] = open(fpath, "r")
-
-    data, headers = encode_multipart_data(params, files)
-
-    exc = None
-    msg = ""
-
-    try:
-        payload = geturl(url, creds=creds, headers=headers, data=data)
-        if payload != "OK":
-            raise TypeError("Unexpected result from call: %s" % payload)
-        else:
-            msg = "Success"
-    except urllib2.HTTPError as exc:
-        msg = "http error [%s]" % exc.code
-    except urllib2.URLError as exc:
-        msg = "url error [%s]" % exc.reason
-    except socket.timeout as exc:
-        msg = "socket timeout [%s]" % exc
-    except TypeError as exc:
-        msg = exc.message
-    except Exception as exc:
-        msg = "unexpected error [%s]" % exc
-
-    sys.stderr.write("%s\n" % msg)
-    sys.exit((exc is None))
-
-if __name__ == '__main__':
-    main()
+{{maas_signal_py}}
 END_MAAS_SIGNAL
 
+add_bin "maas-get" <<END_MAAS_GET
+{{maas_get_py}}
+END_MAAS_GET
+
+
 main
 exit

