bigdata-dev team mailing list archive

[Merge] lp:~bigdata-dev/charms/trusty/apache-hadoop-hdfs-master/smoke-test into lp:~bigdata-dev/charms/trusty/apache-hadoop-hdfs-master/trunk

Cory Johns has proposed merging lp:~bigdata-dev/charms/trusty/apache-hadoop-hdfs-master/smoke-test into lp:~bigdata-dev/charms/trusty/apache-hadoop-hdfs-master/trunk.

Requested reviews:
  Juju Big Data Development (bigdata-dev)

For more details, see:
https://code.launchpad.net/~bigdata-dev/charms/trusty/apache-hadoop-hdfs-master/smoke-test/+merge/268263

Added smoke-test action and relevant README section
-- 
Your team Juju Big Data Development is requested to review the proposed merge of lp:~bigdata-dev/charms/trusty/apache-hadoop-hdfs-master/smoke-test into lp:~bigdata-dev/charms/trusty/apache-hadoop-hdfs-master/trunk.
=== modified file 'README.md'
--- README.md	2015-08-07 21:30:37 +0000
+++ README.md	2015-08-17 18:17:22 +0000
@@ -27,6 +27,35 @@
     hadoop jar my-job.jar
 
 
+## Status and Smoke Test
+
+The services provide extended status reporting to indicate when they are ready:
+
+    juju status --format=tabular
+
+This is particularly useful when combined with `watch` to track the ongoing
+progress of the deployment:
+
+    watch -n 0.5 juju status --format=tabular
+
+The message for each unit will provide information about that unit's state.
+Once they all indicate that they are ready, you can perform a "smoke test"
+to verify that HDFS is working as expected using the built-in `smoke-test`
+action:
+
+    juju action do hdfs-master/0 smoke-test
+
+After a few seconds, you can check the results of the smoke test:
+
+    juju action status
+
+You will see `status: completed` if the smoke test was successful, or
+`status: failed` if it was not.  You can get more information on why it failed
+via:
+
+    juju action fetch <action-id>
+
+
 ## Deploying in Network-Restricted Environments
 
 The Apache Hadoop charms can be deployed in environments with limited network

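For illustration only (not part of this merge proposal), the smoke-test
workflow described in the README hunk above could also be driven from a
script. The sketch below assumes the charm is deployed as a service named
hdfs-master (so the action runs on the unit hdfs-master/0) and that the
Juju 1.x action CLI shown in the README is available; the exact output
format of `juju action do` is an assumption.

    # Sketch: queue the smoke-test action and fetch its result.
    import re
    import subprocess
    import time

    # 'juju action do' is assumed to print a line such as
    # "Action queued with id: <uuid>"; pull the id out of that line.
    queued = subprocess.check_output(
        ['juju', 'action', 'do', 'hdfs-master/0', 'smoke-test'])
    match = re.search(r'id:\s*(\S+)', queued.decode('utf-8'))
    if not match:
        raise SystemExit('could not find an action id in: %r' % queued)
    action_id = match.group(1)

    # Give the action a few seconds to run, then fetch the YAML result,
    # which includes 'status:' and any values set via hookenv.action_set().
    time.sleep(10)
    result = subprocess.check_output(['juju', 'action', 'fetch', action_id])
    print(result.decode('utf-8'))

A successful run should show status: completed along with the
outcome: success value set at the end of the new actions/smoke-test script.
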
=== modified file 'actions.yaml'
--- actions.yaml	2015-06-23 17:15:23 +0000
+++ actions.yaml	2015-08-17 18:17:22 +0000
@@ -4,3 +4,5 @@
     description: All of the HDFS processes can be stopped with this Juju action.
 restart-hdfs:
     description: All of the HDFS processes can be restarted with this Juju action.
+smoke-test:
+    description: Verify that HDFS is working by creating and removing a small file.

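As background (an aside, not part of this diff): Juju pairs each key in
actions.yaml with an executable of the same name in the charm's actions/
directory, which is why the new actions/smoke-test script below carries the
implementation. A quick consistency check, run from the charm root, might
look like this (sketch only; assumes PyYAML is installed):

    # Sketch: confirm every declared action has a matching executable script.
    import os
    import yaml

    with open('actions.yaml') as fp:
        declared = yaml.safe_load(fp) or {}

    for name in declared:
        path = os.path.join('actions', name)
        if not os.path.isfile(path) or not os.access(path, os.X_OK):
            raise SystemExit('%s is missing or not executable' % path)
    print('all %d actions have executable scripts' % len(declared))
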
=== added file 'actions/smoke-test'
--- actions/smoke-test	1970-01-01 00:00:00 +0000
+++ actions/smoke-test	2015-08-17 18:17:22 +0000
@@ -0,0 +1,54 @@
+#!/usr/bin/env python
+
+import sys
+
+try:
+    from charmhelpers.core import hookenv
+    from charmhelpers.core import unitdata
+    from jujubigdata.utils import run_as
+    charm_ready = unitdata.kv().get('charm.active', False)
+except ImportError:
+    charm_ready = False
+
+if not charm_ready:
+    # hookenv may not be importable yet; use the action-fail tool directly
+    from subprocess import call
+    call(['action-fail', 'HDFS service not yet ready'])
+    sys.exit()
+
+
+# make sure no hdfs-test directory is left over from a previous run
+output = run_as('ubuntu', 'hdfs', 'dfs', '-ls', '/tmp', capture_output=True)
+if '/tmp/hdfs-test' in output:
+    run_as('ubuntu', 'hdfs', 'dfs', '-rm', '-R', '/tmp/hdfs-test')
+    output = run_as('ubuntu', 'hdfs', 'dfs', '-ls', '/tmp', capture_output=True)
+    if 'hdfs-test' in output:
+        hookenv.action_fail('Unable to remove existing hdfs-test directory')
+        sys.exit()
+
+# create the directory
+run_as('ubuntu', 'hdfs', 'dfs', '-mkdir', '-p', '/tmp/hdfs-test')
+run_as('ubuntu', 'hdfs', 'dfs', '-chmod', '-R', '777', '/tmp/hdfs-test')
+
+# verify the newly created hdfs-test subdirectory exists
+output = run_as('ubuntu', 'hdfs', 'dfs', '-ls', '/tmp', capture_output=True)
+for line in output.split('\n'):
+    if '/tmp/hdfs-test' in line:
+        if 'ubuntu' not in line or 'drwxrwxrwx' not in line:
+            hookenv.action_fail('Permissions incorrect for hdfs-test directory')
+            sys.exit()
+        break
+else:
+    hookenv.action_fail('Unable to create hdfs-test directory')
+    sys.exit()
+
+# remove the directory
+run_as('ubuntu', 'hdfs', 'dfs', '-rm', '-R', '/tmp/hdfs-test')
+
+# verify the hdfs-test subdirectory has been removed
+output = run_as('ubuntu', 'hdfs', 'dfs', '-ls', '/tmp', capture_output=True)
+if '/tmp/hdfs-test' in output:
+    hookenv.action_fail('Unable to remove hdfs-test directory')
+    sys.exit()
+
+hookenv.action_set({'outcome': 'success'})


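For readers unfamiliar with the run_as helper imported from
jujubigdata.utils, the behaviour the script relies on is roughly sketched
below. This is an approximation for illustration only; the real helper in
the jujubigdata package handles environment setup and error reporting
differently.

    # Rough sketch (assumption, not the real jujubigdata implementation):
    # run a command as another user, optionally capturing its stdout.
    import subprocess

    def run_as(user, *command, **kwargs):
        capture_output = kwargs.pop('capture_output', False)
        cmd = ['su', user, '-c', ' '.join(command)]
        if capture_output:
            return subprocess.check_output(cmd).decode('utf-8')
        subprocess.check_call(cmd)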