haikuwebkit/PerformanceTests/Dromaeo/resources/dromaeorunner.js

(function(){
    var ITERATION_COUNT = 5;
    var DRT = {
        baseURL: "./resources/dromaeo/web/index.html",
        // Loads the Dromaeo harness for testName in an iframe and wires its
        // progress messages up to PerfTestRunner.
        setup: function(testName) {
            var iframe = document.createElement("iframe");
            var url = DRT.baseURL + "?" + testName + '&numTests=' + ITERATION_COUNT;
            iframe.setAttribute("src", url);
            // Insert the iframe as the first child so that log lines appended
            // to the body do not shift the iframe's position.
            document.body.insertBefore(iframe, document.body.firstChild);
            iframe.addEventListener(
                "load", function() {
                    DRT.targetDocument = iframe.contentDocument;
                    DRT.targetWindow = iframe.contentDocument.defaultView;
                });
            window.addEventListener(
                "message",
                function(event) {
                    switch (event.data.name) {
                    case "dromaeo:ready":
                        DRT.start();
                        break;
                    case "dromaeo:progress":
                        DRT.progress(event.data);
                        break;
                    case "dromaeo:alldone":
                        DRT.teardown(event.data);
                        break;
                    }
                });
        },
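        // Message shapes, inferred from the handlers above rather than from a
        // spec: the harness posts { name: "dromaeo:ready" } once loaded,
        // { name: "dromaeo:progress", status: { score: { name: ..., times: [...] } } }
        // as each subtest finishes, and { name: "dromaeo:alldone", result: [...] }
        // when the whole run is complete.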
        // Builds the test description handed to PerfTestRunner.reportValues.
        // A name marks a subtest; continueTesting keeps the harness alive until
        // the final, unnamed summary is reported from teardown.
        testObject: function(name) {
            return {customIterationCount: ITERATION_COUNT, doNotMeasureMemoryUsage: true, doNotIgnoreInitialRun: true, unit: 'runs/s',
                name: name, continueTesting: !!name};
        },
        start: function() {
            DRT.targetWindow.postMessage({ name: "dromaeo:start" }, "*");
        },
        progress: function(message) {
            var score = message.status.score;
            if (score)
                // Report the finished subtest's per-iteration runs/s values.
                PerfTestRunner.reportValues(this.testObject(score.name), score.times);
        },
        teardown: function(data) {
            PerfTestRunner.log('');
            var tests = data.result;
            var times = [];
            // Combine each iteration's subtest rates into a whole-suite rate:
            // accumulate the time per run (1 / runsPerSecond) across subtests,
            // then invert the sums below (a harmonic combination of the rates).
            for (var i = 0; i < tests.length; ++i) {
                for (var j = 0; j < tests[i].times.length; ++j) {
                    var runsPerSecond = tests[i].times[j];
                    times[j] = (times[j] || 0) + 1 / runsPerSecond;
                }
            }
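            // For example, if iteration j measured two subtests at 100 and 50
            // runs/s, times[j] = 1/100 + 1/50 = 0.03, and 1 / 0.03 ≈ 33.3 runs/s
            // is reported for the whole suite.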
            PerfTestRunner.reportValues(this.testObject(), times.map(function (time) { return 1 / time; }));
        },
        // Returns a function that forwards its arguments to the function of the
        // same name inside the Dromaeo iframe.
        targetDelegateOf: function(functionName) {
            return function() {
                DRT.targetWindow[functionName].apply(null, arguments);
            };
        },
        log: function(text) {
            PerfTestRunner.log(text);
        }
    };
    // These functions are referenced from htmlrunner.js.
    this.startTest = DRT.targetDelegateOf("startTest");
    this.test = DRT.targetDelegateOf("test");
    this.endTest = DRT.targetDelegateOf("endTest");
    this.prep = DRT.targetDelegateOf("prep");
    window.DRT = DRT;
})();
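
// Usage sketch (an assumption for context, not part of this file): a Dromaeo
// perf test page is expected to load the shared harness and this runner, then
// name the Dromaeo suite to run, e.g.:
//
//     <script src="../resources/runner.js"></script>
//     <script src="resources/dromaeorunner.js"></script>
//     <script>DRT.setup("dom-attr");</script>
//
// PerfTestRunner comes from resources/runner.js; DRT.setup() appends the suite
// name and numTests=ITERATION_COUNT to baseURL, so each subtest runs five times.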