yade-dev team mailing list archive
Message #01096
[svn] r1714 - trunk/examples/collider-perf
Author: eudoxos
Date: 2009-03-09 00:03:42 +0100 (Mon, 09 Mar 2009)
New Revision: 1714
Added:
trunk/examples/collider-perf/mkGraph.py
Modified:
trunk/examples/collider-perf/README
trunk/examples/collider-perf/perf.table
Log:
1. Fixed collider performance benchmark
2. Added script to generate graph from log files (like http://yade.wikia.com/wiki/Colliders_performace)
Modified: trunk/examples/collider-perf/README
===================================================================
--- trunk/examples/collider-perf/README 2009-03-08 21:15:35 UTC (rev 1713)
+++ trunk/examples/collider-perf/README 2009-03-08 23:03:42 UTC (rev 1714)
@@ -5,10 +5,13 @@
yade-trunk-multi perf.table perf.py
-and wait. If you feel brave enough, uncomment lines with many spheres in perf.table.
-They take very long time (128k spheres with SAP collider over 3 hours)
+It will take time to finish. To get graphs from the generated log files, say
+ python mkGraph.py *.log
+Individual results are in log files named like perf.64k.log (64k spheres with
+PersistentSAPCollider), perf.32k.q.log (32k spheres, SpatialQuickSortCollider), etc.
+
1. A file with nSpheres spheres (loose packing) is generated first, if it doesn't exist yet.
This can take a few minutes, but is done only the first time for the particular sphere
number.
@@ -16,6 +19,3 @@
done and timing.stats() printed (appears in the log file).
3. Another 100 iterations are measured with timing.stats(), after which the test exits.
-To get the results, grep the resulting log files, named like perf.64k.log (64k spheres with
-PersistentSAPCollider), perf.32k.q.log (32k spheres, SpatialQuickSortCollider) etc.
-
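As a side note, the filename convention described above (perf.<N>k.log for PersistentSAPCollider runs, perf.<N>k.q.log for SpatialQuickSortCollider) can be decoded as in this minimal sketch; the parse_name helper is hypothetical and not part of the commit, it only mirrors the convention the README states:

```python
# Sketch only (not part of the commit): map a benchmark log filename to
# (number of spheres, collider name), following the README's naming
# convention: perf.<N>k.log -> PersistentSAPCollider,
# perf.<N>k.q.log -> SpatialQuickSortCollider.
def parse_name(fname):
    n = fname.split('.')[1]           # e.g. '64k'
    assert n.endswith('k')
    nSpheres = 1000 * int(n[:-1])     # '64k' -> 64000
    collider = 'SpatialQuickSortCollider' if '.q.' in fname else 'PersistentSAPCollider'
    return nSpheres, collider

print(parse_name('perf.64k.log'))     # (64000, 'PersistentSAPCollider')
print(parse_name('perf.32k.q.log'))   # (32000, 'SpatialQuickSortCollider')
```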
Added: trunk/examples/collider-perf/mkGraph.py
===================================================================
--- trunk/examples/collider-perf/mkGraph.py 2009-03-08 21:15:35 UTC (rev 1713)
+++ trunk/examples/collider-perf/mkGraph.py 2009-03-08 23:03:42 UTC (rev 1714)
@@ -0,0 +1,35 @@
+#encoding: utf-8
+dta={'QS':{},'SAP':{}}
+import sys
+for f in sys.argv[1:]:
+ print f,'',
+ N=f.split('.')[1];
+ assert(N[-1]=='k'); N=1000*int(N[:-1])
+ if '.q.' in f: collider='QS'
+ else: collider='SAP'
+ for l in open(f):
+ if 'Collider' in l:
+ t=l.split()[2]; assert(t[-2:]=='us'); t=float(t[:-2])/1e6
+ if not dta[collider].has_key(N): dta[collider][N]=[t]
+ else: dta[collider][N]+=[t*0.01] # the second time is per 100 iterations
+print
+
+SAP_N=dta['SAP'].keys(); SAP_N.sort()
+QS_N=dta['QS'].keys(); QS_N.sort()
+SAPinit=[dta['SAP'][N][0] for N in SAP_N]; SAPstep=[dta['SAP'][N][1] for N in SAP_N]
+QSinit=[dta['QS'][N][0] for N in QS_N]; QSstep=[dta['QS'][N][1] for N in QS_N]
+from pylab import *
+plot(SAP_N,SAPinit,'m')
+gca().set_yscale('log')
+xlabel("Number of spheres")
+ylabel(u"Log (!) time for the 1st SAP collider step [s]")
+title("SAP vs. QuickSort colliders performance")
+legend(('SAP init',),'upper left')
+
+ax2=twinx()
+plot(SAP_N,SAPstep,'r-',QS_N,QSstep,'g-',QS_N,QSinit,'b-')
+ylabel(u"Linear time per 1 step [s]")
+legend(('SAP step','QuickSort step','QuickSort init'),'right')
+grid()
+savefig('colliders.svg')
+show()
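The mkGraph.py script above is Python 2 of its era (print statement, dict.has_key, positional legend location). For readers on Python 3, here is a rough sketch of just its timing extraction, under the same assumption the script makes about the log format: a line containing 'Collider' carries the time as its third whitespace-separated field with a 'us' (microseconds) suffix, and the first match is the initial step while later matches cover 100 iterations:

```python
# Sketch only: Python 3 rendering of the timing extraction from mkGraph.py.
# Assumes log lines where the third whitespace-separated field is a time
# in microseconds, e.g. '... Collider 2000000us ...'.
def collider_times(lines):
    """Return times in seconds: first entry is the initial collider step,
    later entries are per-step averages (logged total covers 100 iterations)."""
    times = []
    for l in lines:
        if 'Collider' not in l:
            continue
        t = l.split()[2]
        assert t.endswith('us')
        t = float(t[:-2]) / 1e6              # microseconds -> seconds
        # first match: 1st-step time; later matches: total over 100
        # iterations, scaled by 0.01 to get per-step time (as in mkGraph.py)
        times.append(t if not times else t * 0.01)
    return times

print(collider_times(['xx Collider 2000000us', 'xx Collider 5000000us']))
```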
Modified: trunk/examples/collider-perf/perf.table
===================================================================
--- trunk/examples/collider-perf/perf.table 2009-03-08 21:15:35 UTC (rev 1713)
+++ trunk/examples/collider-perf/perf.table 2009-03-08 23:03:42 UTC (rev 1714)
@@ -1,10 +1,10 @@
description nSpheres collider
-#128k 128000 'PersistentSAPCollider'
-#96k 96000 'PersistentSAPCollider'
-#64k 64000 'PersistentSAPCollider'
-#56k 56000 'PersistentSAPCollider'
-#48k 48000 'PersistentSAPCollider'
-#40k 40000 'PersistentSAPCollider'
+128k 128000 'PersistentSAPCollider'
+96k 96000 'PersistentSAPCollider'
+64k 64000 'PersistentSAPCollider'
+56k 56000 'PersistentSAPCollider'
+48k 48000 'PersistentSAPCollider'
+40k 40000 'PersistentSAPCollider'
36k 36000 'PersistentSAPCollider'
32k 32000 'PersistentSAPCollider'
28k 28000 'PersistentSAPCollider'
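perf.table is a whitespace-separated parameter table, one batch run per row, with '#' commenting a row out (the commit above simply uncomments the large-sphere-count rows). Reading such a table could look like the following sketch; the column names come from the table header, but the parser itself is hypothetical, not how yade-trunk-multi actually consumes the file:

```python
# Sketch only (not the actual yade batch parser): read a perf.table-style
# parameter file. Columns follow the header 'description nSpheres collider';
# rows starting with '#' are skipped, like the commented-out runs above.
def read_table(text):
    lines = [l.strip() for l in text.splitlines() if l.strip()]
    header = lines[0].split()
    rows = []
    for l in lines[1:]:
        if l.startswith('#'):
            continue                      # commented-out run
        rows.append(dict(zip(header, l.split())))
    return rows

table = """description nSpheres collider
#128k 128000 'PersistentSAPCollider'
32k 32000 'PersistentSAPCollider'
"""
for r in read_table(table):
    print(r['description'], r['nSpheres'], r['collider'])
```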