canonical-ubuntu-qa team mailing list archive

[Merge] ~andersson123/autopkgtest-cloud:apache_logging_monitoring into autopkgtest-cloud:master

 

Tim Andersson has proposed merging ~andersson123/autopkgtest-cloud:apache_logging_monitoring into autopkgtest-cloud:master.

Requested reviews:
  Canonical's Ubuntu QA (canonical-ubuntu-qa)

For more details, see:
https://code.launchpad.net/~andersson123/autopkgtest-cloud/+git/autopkgtest-cloud/+merge/448615
-- 
Your team Canonical's Ubuntu QA is requested to review the proposed merge of ~andersson123/autopkgtest-cloud:apache_logging_monitoring into autopkgtest-cloud:master.
diff --git a/charms/focal/autopkgtest-web/webcontrol/apache_request_monitoring b/charms/focal/autopkgtest-web/webcontrol/apache_request_monitoring
new file mode 100755
index 0000000..3ca3c9b
--- /dev/null
+++ b/charms/focal/autopkgtest-web/webcontrol/apache_request_monitoring
@@ -0,0 +1,55 @@
+#!/usr/bin/python3
+"""
+Tails the apache2 access log and periodically writes per-HTTP-status-code
+request counts to InfluxDB for viewing in Grafana.
+"""
+
+import os
+import select
+import subprocess
+import time
+
+from influxdb import InfluxDBClient
+
+
+HOSTNAME = os.environ["INFLUXDB_HOSTNAME"]
+PORT = int(os.environ["INFLUXDB_PORT"])
+USERNAME = os.environ["INFLUXDB_USERNAME"]
+PASSWORD = os.environ["INFLUXDB_PASSWORD"]
+DATABASE = os.environ["INFLUXDB_DATABASE"]
+
+INFLUX_CLIENT = InfluxDBClient(HOSTNAME, PORT, USERNAME,
+                               PASSWORD, DATABASE)
+
+
+# tail -F keeps following the access log across logrotate; stderr is discarded
+# so an unread pipe can never block tail
+f = subprocess.Popen(['tail', '-F', '/var/log/apache2/access.log'],
+                     stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)
+
+p = select.poll()
+p.register(f.stdout)
+
+data = {}
+
+upload_interval_seconds = 10
+upload_start = time.time()
+
+while True:
+    if p.poll(1):
+        line = f.stdout.readline().decode("utf-8", errors="replace")
+        if "GET" in line:
+            code = line.split(" ")[8]  # status field of the common log format
+            if code not in data:
+                data[code] = {
+                    "measurement": "apache2_requests",
+                    "fields": {"count": 0},
+                    "tags": {}
+                }
+                data[code]["tags"]["code"] = code
+            data[code]["fields"]["count"] += 1
+    # Flush the accumulated per-status-code counts to InfluxDB no more than
+    # once every upload_interval_seconds, even when no new requests arrive.
+    if (time.time() - upload_start) > upload_interval_seconds:
+        if data:
+            INFLUX_CLIENT.write_points(list(data.values()))
+        data = {}
+        upload_start = time.time()
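
For reference, a minimal standalone sketch of the point layout the script accumulates and writes (the localhost address, credentials and database name below are placeholders, not the charm's actual configuration):

    #!/usr/bin/python3
    # Illustrative only: write one apache2_requests point using the same
    # measurement/fields/tags layout as apache_request_monitoring above.
    from influxdb import InfluxDBClient

    client = InfluxDBClient("localhost", 8086, "user", "password", "metrics")
    point = {
        "measurement": "apache2_requests",
        "fields": {"count": 42},
        "tags": {"code": "200"},
    }
    client.write_points([point])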