haikuwebkit/PerformanceTests/MotionMark/developer.html

<!--
Copyright (C) 2015-2020 Apple Inc. All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS''
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS
BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
THE POSSIBILITY OF SUCH DAMAGE.
-->
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, user-scalable=no">
2016-06-22 03:11:15 +00:00
[MotionMark] Add support for version numbers https://bugs.webkit.org/show_bug.cgi?id=186479 Reviewed by Said Abou-Hallawa. Add support for displaying the version number as well as including it in the JSON results. When loading the front page, script replaces any element with classname version with the version number of the benchmark, which is stored in Strings.version. The JSON structure for the results includes a new version property: { "version": "1.0", "options": { ... }, "data": [ ... ] } When dragging a results file, the version listed will come from the JSON file. Older results will not have had the version property, in which case it will default to "1.0". * MotionMark/index.html: Update title to some other default. Script will update it. Include the version number in the logo title. * MotionMark/developer.html: Ditto. * MotionMark/about.html: Ditto. * MotionMark/resources/runner/motionmark.js: (ResultsDashboard): Update constructor to include version. This is used when serializing results out to JSON, and displaying the results panel in developer mode. (ResultsDashboard._processData): When running the benchmark, include benchmark version string in the results object. (ResultsDashboard.version): (window.benchmarkRunnerClient.willStartFirstIteration): When running the benchmark, pass the benchmark version string to the dashboard, which holds the results. (window.sectionsManager.setSectionVersion): Helper function to update the element in the section with the class name version. (window.benchmarkController.initialize): Populate all DOM elements with class name "version" with the version string. Update the page title. (window.benchmarkController.showResults): When showing results, update the version string based on what is included in the JSON results, which would be the same as the benchmark version. * MotionMark/resources/runner/motionmark.css: Include missing copyright. Wrap the SVG logo in a div and include the version string. * MotionMark/resources/strings.js: Add strings for the page title template, and the version. * MotionMark/resources/debug-runner/motionmark.css: * MotionMark/resources/debug-runner/motionmark.js: (window.benchmarkRunnerClient.willStartFirstIteration): When running the benchmark, pass the benchmark version string to the dashboard, which holds the results. (window.benchmarkController.initialize): Populate all DOM elements with class name "version" with the version string. Update the page title. When dragging in JSON results, look for version to pass to the dashboard. If it doesn't exist, default to "1.0". (window.benchmarkController.showResults): When showing results, update the version string based on what is included in the JSON results, instead of the current benchmark version. * MotionMark/resources/debug-runner/tests.js: Update page title template. Canonical link: https://commits.webkit.org/202238@main git-svn-id: https://svn.webkit.org/repository/webkit/trunk@233147 268f45cc-cd09-0410-ab3c-d52691b4dbfc
2018-06-25 15:38:03 +00:00
<title>MotionMark developer</title>
<link rel="stylesheet" href="resources/runner/motionmark.css">
<link rel="stylesheet" href="resources/debug-runner/motionmark.css">
Split benchmark into two different pages https://bugs.webkit.org/show_bug.cgi?id=152458 Reviewed by Simon Fraser. Address comments. * Animometer/resources/debug-runner/benchmark-runner.js: (BenchmarkRunner.prototype._runTestAndRecordResults): When the testing is complete the frame owning the sampler goes away, and a later call to get the JSON data is no longer available. Process the data right here, instead, and just reference it when displaying the results in ResultsDashboard.prototype._processData. * Animometer/resources/extensions.js: (Array.prototype.fill.Array.prototype.fill): Add a null check. Remove braces around single-line clause. (Array.prototype.find.Array.prototype.find): Update the null check. (ResultsDashboard.prototype._processData): Use the already-processed data. * Animometer/resources/runner/animometer.css: (.frame-container > iframe): Remove calc(). Move Array functions to extensions.js since that is included by the harness. Add ES6 Array polyfills. * Animometer/resources/algorithm.js: (Array.prototype.swap): Moved to extensions.js. * Animometer/resources/extensions.js: (Array.prototype.swap): (Array.prototype.fill): Added. (Array.prototype.find): Added. Adjust styles for iPad. * Animometer/resources/runner/animometer.css: (@media screen and (min-device-width: 768px)): Apply to iPad as well. (@media screen and (max-device-width: 1024px)): Update width for iPads. Adjustment styles for iOS. * Animometer/developer.html: Different divs contain the iframe, so use a class instead and update the style rules. * Animometer/index.html: * Animometer/resources/debug-runner/animometer.css: Remove extraneous rules. (@media screen and (min-device-width: 1800px)): Move this up. * Animometer/resources/runner/animometer.css: Add rules to accomodate iOS. Get rid of prefixed flex properties for now. * Animometer/resources/debug-runner/animometer.css: * Animometer/resources/runner/animometer.css: Update the structure of the harness. Remove the JSON-per-test but keep the JSON of the whole test run. Use the full page in order to display the graph. * Animometer/developer.html: Update several of the JS file includes to UTF-8. Remove header and footer. Test results screen includes score, average, and worst 5% statistics. * Animometer/index.html: Make structure similar to developer.html. * Animometer/resources/debug-runner/animometer.css: Remove most of the button rules since they are superfluous. Move the progress bar to the top, fixed. Update the results page rules. * Animometer/resources/debug-runner/animometer.js: Remove most of the additions to sectionsManager since they are no longer needed. (setSectionHeader): Updates header of the section. (window.suitesManager._updateStartButtonState): Update selector. (showResults): Add keypress event for selecting different data for copy/paste. Update how the results are populated. Include full test JSON in a textarea, rather than requiring a button press. (showTestGraph): * Animometer/resources/debug-runner/tests.js: Update structure of Headers. Define different kinds of headers. Headers can control their title, and the text used as the cell contents, including class name. * Animometer/resources/extensions.js: (ResultsTable): Update to include a flattened version of the headers, used while populating table contents. Remove unneeded helper functions for creating the table. Rename "show" to "add". * Animometer/resources/runner/animometer.css: Update rules to accommodate the new structure. 
* Animometer/resources/runner/animometer.js: (window.sectionsManager.setSectionScore): Helper function to set the score and mean for a section. (window.sectionsManager.populateTable): Helper function to set the table. (window.benchmarkController.showResults): Refactor. (window.benchmarkController.selectResults): Update selectors. * Animometer/resources/runner/tests.js: Set Headers. Debug harness extends it. Update debug runner to have similar names to the basic runner. Include that page's CSS and remove extraneous CSS rules. Get rid of the statistics table #record. * Animometer/developer.html: Rename #home to #intro. Rename .spacer to hr. * Animometer/resources/debug-runner/animometer.css: Set to flexbox when selected. * Animometer/resources/debug-runner/animometer.js: Remove recordTable. (window.suitesManager._updateStartButtonState): Update selector to #intro. (setupRunningSectionStyle): Deleted. * Animometer/resources/runner/animometer.css: (#test-container.selected): Change to flex-box only when visible. Remove recordTable. * Animometer/resources/debug-runner/benchmark-runner.js: (BenchmarkRunner.prototype._runTestAndRecordResults): * Animometer/resources/runner/tests.js: * Animometer/tests/bouncing-particles/resources/bouncing-canvas-images.js: * Animometer/tests/bouncing-particles/resources/bouncing-canvas-particles.js: * Animometer/tests/bouncing-particles/resources/bouncing-canvas-shapes.js: * Animometer/tests/bouncing-particles/resources/bouncing-css-images.js: * Animometer/tests/bouncing-particles/resources/bouncing-css-shapes.js: * Animometer/tests/bouncing-particles/resources/bouncing-particles.js: * Animometer/tests/bouncing-particles/resources/bouncing-svg-images.js: * Animometer/tests/bouncing-particles/resources/bouncing-svg-shapes.js: * Animometer/tests/examples/resources/canvas-electrons.js: * Animometer/tests/examples/resources/canvas-stars.js: * Animometer/tests/misc/resources/compositing-transforms.js: * Animometer/tests/resources/main.js: * Animometer/tests/resources/stage.js: (StageBenchmark): Remove _recordTable. * Animometer/tests/simple/resources/simple-canvas-paths.js: * Animometer/tests/simple/resources/simple-canvas.js: * Animometer/tests/template/resources/template-canvas.js: * Animometer/tests/template/resources/template-css.js: * Animometer/tests/template/resources/template-svg.js: * Animometer/tests/text/resources/layering-text.js: * Animometer/resources/debug-runner/animometer.js: (willStartFirstIteration): Fix selector, since results-table is used in multiple places, so it cannot be an id. Make it possible to select the scores, or the whole table data, by cycling through different selections through key press of 's'. * Animometer/resources/runner/animometer.js: (window.benchmarkController.showResults): Attach a keypress handler if it hasn't been added already. (window.benchmarkController.selectResults): * Animometer/resources/runner/tests.js: Cycle through different ranges. Fix a few fly-by errors. * Animometer/resources/debug-runner/benchmark-runner.js: (BenchmarkRunnerState.prototype.prepareCurrentTest): Update the frame relative path since the files are now in the top directory instead of inside runner/. (BenchmarkRunner.prototype._runTestAndRecordResults): Incorrect reference to function. (BenchmarkRunner.prototype.step): Member variable is never used. A little stylistic cleanup. 
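<!-- Shared resources: string tables and utility extensions (e.g. Heap, SampleData, DocumentExtension) used by both the release and debug harnesses. -->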
<script src="resources/strings.js"></script>
<script src="resources/extensions.js"></script>
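<!-- Statistics helpers: Experiment, Regression, and bootstrapping routines used to compute test scores. -->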
<script src="resources/statistics.js"></script>
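<!-- Test definitions: scripts in debug-runner/ build on the runner/ scripts of the same name, so runner/tests.js must load before debug-runner/tests.js. -->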
<script src="resources/runner/tests.js" charset="utf-8"></script>
<script src="resources/debug-runner/tests.js" charset="utf-8"></script>
* MotionMark/tests/template/template-css.html: Renamed from PerformanceTests/Animometer/tests/template/template-css.html. * MotionMark/tests/template/template-svg.html: Renamed from PerformanceTests/Animometer/tests/template/template-svg.html. Canonical link: https://commits.webkit.org/183912@main git-svn-id: https://svn.webkit.org/repository/webkit/trunk@210459 268f45cc-cd09-0410-ab3c-d52691b4dbfc
2017-01-06 23:38:32 +00:00
<script src="resources/runner/motionmark.js"></script>
<script src="resources/debug-runner/motionmark.js" charset="utf-8"></script>
<script src="resources/runner/benchmark-runner.js"></script>
<script src="resources/debug-runner/d3.min.js"></script>
<script src="resources/debug-runner/graph.js" charset="utf-8"></script>
</head>
Improvements to Animometer benchmark https://bugs.webkit.org/show_bug.cgi?id=157738 Reviewed by Dean Jackson. Provisionally reviewed by Said Abou-Hallawa. Include confidence interval for the final score, and store the canvas size in the serialization so that it is accurately shown in results. * Animometer/developer.html: Add a "confidence" div. * Animometer/index.html: Ditto. Convert "mean" to "confidence". * Animometer/resources/debug-runner/animometer.js: Look at options, and if the configuration is included, update the body class based on it (similar to what we do when we first load the page). That way, if you drag-and-drop previous results in, that configuration is reflected in the dashboard. Show the full confidence interval. * Animometer/resources/debug-runner/animometer.css: * Animometer/resources/debug-runner/animometer.js: Style update. * Animometer/resources/runner/animometer.css: * Animometer/resources/runner/animometer.js: (_processData): Propagate the confidence interval values out and calculate the lower and upper bounds. For now, shortcut the aggregate calculation, since we only go through one iteration. (this._processData.calculateScore): Propagate the confidence interval out to be next to the score. Depending on the controller these values are calculated differently. (this._processData._getResultsProperty): Convenience function. (this._processData.get score): Refactor. (this._processData.get scoreLowerBound): Get the aggregate lower bound. (this._processData.get scoreUpperBound): Get the aggregate upper bound. (window.sectionsManager.setSectionScore): (window.benchmarkController._startBenchmark): When the benchmark starts, note the canvas size and add it to options. That way it will be automatically be serialized. (window.benchmarkController.showResults): Include the maximum deviation percentage. * Animometer/resources/runner/lines.svg: Make the background line up with the skew. * Animometer/resources/runner/tests.js: (Headers.details.text): Refactor. * Animometer/resources/statistics.js: (Statistics.largestDeviationPercentage): Convenience function to calculate the deviation percentage on either end and return the largest deviation. * Animometer/resources/strings.js: Allow specifying a regression profile to use instead of taking the one with the lowest error. Address an issue in the focus test when the regression calculated ends up overestimating the change point, causing a cascade of tougher ramps. The reason behind this is that at max complexity of an initial ramp, the frame length is very high, and it influences the second segment of the piecewise regression strongly, causing it to be very steep. As a result, the first segment, expected to be flat, ends up covering a higher range of complexity. That makes the change point much higher than it should be. To avoid this, we will add a sanity check on the maximum value of the ramp. If the regression's projected value at the maximum complexity of the current ramp is very slow (less than 20 fps), 1) reduce the maximum complexity by 20%, and 2) do not include the regression's change point in the change point estimator. That estimator is used as the midpoint of the next ramp, and including the change point from a poor regression can bake in the error. The controller already knows how to adjust for ramps that are too easy for the system. * Animometer/resources/runner/animometer.js: (this._processData.findRegression): Takes a preferred profile and gives that to Regression. 
(this._processData.calculateScore): With the ramp controller, take the profile of the ramp that was used the most when calculating the ramp's regression. That profile is what is used for the test's score.
* Animometer/resources/statistics.js:
(Regression.Utilities.createClass): Update to take an options object which can specify a profile to calculate with. Otherwise it will continue to use both and select the one with the lower error.
(_calculateRegression): Fix an issue where we claim 0 error if the regression calculation fails due to divide-by-zero. Instead, reject that regression calculation by giving it Number.MAX_VALUE.
* Animometer/resources/strings.js: New strings for marking a regression as flat or slope.
* Animometer/tests/resources/main.js:
(RampController): Rename the thresholds for clarity. Add a threshold that, if exceeded, will lower the maximum complexity of the next ramp.
(tune): Relax the cdf check to consider whether the interval definitely falls in the desired frame length threshold.
(processSamples): Include the profile in the ramp.

Update the ramp controller test. Increase the length of the test to 30 seconds, and extend the interval to 120 ms during sampling. Improve the estimation of the ramp parameters.
* Animometer/developer.html: Change default to 30 seconds, and don't show the progress bar by default.
* Animometer/resources/runner/animometer.js: Change default to 30 seconds.
* Animometer/tests/resources/main.js: A number of improvements to the ramp controller, in the order in which they appear in the patch:
- With a longer test length, use longer ramps with longer intervals to get more data at each complexity. Keep the 100 ms interval length during the ramp up phase, since we don't need to spend more time there to find the right order of magnitude, but increase it during the ramps to 120 ms.
- The ramp linearly interpolates the complexity to render based on its timestamp, but it would never sample the minimum complexity. Instead of lerping max to min complexity from time 0 to t, where t is the ramp length, lerp from 0 to (t - intervalSampleLength) so that we get at least one interval sample at the min complexity for that ramp.
- Some regression calculations only come out with one line segment rather than the two we expect. This could be due to a noisy ramp, or the ramp's range being too narrow. If that's the case, influence the minimum complexity of the next ramp towards the lowest bound of 1, so that at least part of the ramp covers a complexity range that the system can handle at a full 60 fps.
- Remove an assignment to interpolatedFrameLength, since it is never subsequently used.

Update the format used to serialize the results for analysis. Each data point used to have named properties for fields like complexity and frame rate. In addition, the serialized numbers had rounding errors that took up many characters. Update the format by introducing a new data container called SampleData, which contains a field map. The map maps a string to an array index. Each data point is an array, so, to get a stat, use the field map to get the array index into the data point. This allows future versions to track other data, and reduces the size of the output string by two-thirds.
* Animometer/resources/extensions.js:
(Utilities.toFixedNumber): Add a convenience function that truncates the number to a fixed precision, and converts it back to a number.
(SampleData): New class that contains sample data and a field map that maps properties to an array index.
(get length): Number of data points.
(addField): Add a field to the field map.
(push): Add a data point.
(sort): Sort the data.
(slice): Return a new SampleData object with sliced data.
(forEach): Iterate over the data with the provided function.
(createDatum):
(getFieldInDatum): Returns the data point associated with the field name by looking it up in the field map, in the datum provided, which can be the datum object itself (an array) or an index into the data member variable.
(setFieldInDatum): Sets the data point associated with the field name.
(at): Returns the data point at the provided index.
(toArray): Serializes the data, where the field map serves as property names for each point.
* Animometer/resources/debug-runner/graph.js:
(updateGraphData): Remove unused _testData. Convert the data to the old array format for the graph to use, since the old format was much easier to work with when displaying the graphs.
(onGraphTypeChanged): For some controllers, no alternative score or mean is provided.
* Animometer/resources/runner/animometer.css:
* Animometer/resources/runner/animometer.js: Refactor to use SampleData. Update JSON output to only go to 3 digits of precision for purposes of reducing the data size.
* Animometer/resources/strings.js: Add new strings to put into the field maps.
* Animometer/tests/resources/main.js: Refactor to use SampleData.
* Animometer/developer.html:
* Animometer/index.html: Restructure the results table for both pages. Add a charset attribute to the tests.js include.
* Animometer/resources/debug-runner/animometer.css: Clear out styles from the release runner.
* Animometer/resources/debug-runner/graph.js:
(onGraphTypeChanged): Update score and mean if bootstrap results are available from the controller, since not all controllers do bootstrapping.
* Animometer/resources/debug-runner/tests.js: Update header text.
* Animometer/resources/runner/animometer.css: Include the confidence interval in results.
* Animometer/resources/runner/animometer.js:
(ResultsTable._addHeader): Header contents can be HTML, so use innerHTML instead.
(ResultsTable._addBody): Add a tbody element.
(ResultsTable._addTest): Allow a data cell to invoke a JS function to get its contents.
(window.benchmarkController.showResults): Add a table that includes the tests' confidence intervals.
* Animometer/resources/runner/tests.js:
(Headers.details.text): Add a new details table that includes the bootstrap confidence interval. The interval can be asymmetric, but for simplicity, report the maximum deviation percentage on either side of the bootstrap median.
* Animometer/resources/statistics.js:
(bootstrap): Include the confidence percentage in the return object.

Report canvas size in results.
* Animometer/developer.html: Add markup to indicate whether a small, medium, or large canvas was used.
* Animometer/index.html: Ditto.
* Animometer/resources/debug-runner/animometer.js: Call determineCanvasSize().
* Animometer/resources/runner/animometer.css: Update styles to set the canvas based on the body class size.
* Animometer/resources/runner/animometer.js:
(window.benchmarkController.initialize): Update styles to set the canvas based on the body class size.
(window.benchmarkController.determineCanvasSize): Run various media queries and set the body class based on the size of the device.

* Animometer/developer.html: Refactor to include the main CSS file, and redo the layout so that it doesn't rely on flexbox.
* Animometer/resources/debug-runner/animometer.css:
* Animometer/resources/debug-runner/animometer.js:
(updateDisplay): Since various parts of the script alter the body class, we can't replace the className directly. Instead, remove all display-based values and then add the one that was selected.
* Animometer/resources/debug-runner/graph.js:
(updateGraphData): To set the size of the graph, use window.innerHeight.
* Animometer/resources/runner/animometer.js:
(window.sectionsManager.showSection): Since various parts of the script alter the body class, we can't replace the className directly. Remove all of the section classes individually and then add the one desired.
* Animometer/tests/resources/stage.css: Remove -apple-system as a font to use in the stage.

Canonical link: https://commits.webkit.org/177081@main
git-svn-id: https://svn.webkit.org/repository/webkit/trunk@202314 268f45cc-cd09-0410-ab3c-d52691b4dbfc
2016-06-22 03:11:15 +00:00
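As a rough illustration of the field-map format described in the entry above, here is a minimal sketch of a SampleData-like container and a toFixedNumber helper. This is not the benchmark's actual extensions.js; the method names follow the ChangeLog entry, but the signatures and internals are assumptions.

    // Sketch only: a field-map-backed sample container, per the entry above.
    function toFixedNumber(value, precision) {
        // Truncate to a fixed precision, then convert back to a number.
        return parseFloat(value.toFixed(precision));
    }

    function SampleData(fieldMap, data) {
        // fieldMap maps a field name (string) to an index into each datum array.
        this.fieldMap = fieldMap || {};
        this.data = data || [];
    }

    SampleData.prototype = {
        get length() { return this.data.length; },

        addField: function(name, index) { this.fieldMap[name] = index; },

        push: function(datum) { this.data.push(datum); },

        at: function(index) { return this.data[index]; },

        getFieldInDatum: function(datumOrIndex, fieldName) {
            var datum = typeof datumOrIndex === "number" ? this.data[datumOrIndex] : datumOrIndex;
            return datum[this.fieldMap[fieldName]];
        },

        setFieldInDatum: function(datumOrIndex, fieldName, value) {
            var datum = typeof datumOrIndex === "number" ? this.data[datumOrIndex] : datumOrIndex;
            datum[this.fieldMap[fieldName]] = value;
        },

        toArray: function() {
            // Serialize each datum as an object keyed by field name.
            var fieldMap = this.fieldMap;
            return this.data.map(function(datum) {
                var out = {};
                for (var name in fieldMap)
                    out[name] = datum[fieldMap[name]];
                return out;
            });
        }
    };

With a field map such as { complexity: 0, frameLength: 1 }, a datum like [87, 16.7] can be read back with getFieldInDatum(0, "complexity"), which is what keeps the serialized output compact while still letting later versions add fields.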
<body class="showing-intro">
Split benchmark into two different pages
https://bugs.webkit.org/show_bug.cgi?id=152458
Reviewed by Simon Fraser.

Address comments.
* Animometer/resources/debug-runner/benchmark-runner.js:
(BenchmarkRunner.prototype._runTestAndRecordResults): When the testing is complete, the frame owning the sampler goes away, and a later call to get the JSON data is no longer available. Process the data right here instead, and just reference it when displaying the results in ResultsDashboard.prototype._processData.
* Animometer/resources/extensions.js:
(Array.prototype.fill): Add a null check. Remove braces around a single-line clause.
(Array.prototype.find): Update the null check.
(ResultsDashboard.prototype._processData): Use the already-processed data.
* Animometer/resources/runner/animometer.css:
(.frame-container > iframe): Remove calc().

Move Array functions to extensions.js since that is included by the harness. Add ES6 Array polyfills.
* Animometer/resources/algorithm.js:
(Array.prototype.swap): Moved to extensions.js.
* Animometer/resources/extensions.js:
(Array.prototype.swap):
(Array.prototype.fill): Added.
(Array.prototype.find): Added.

Adjust styles for iPad.
* Animometer/resources/runner/animometer.css:
(@media screen and (min-device-width: 768px)): Apply to iPad as well.
(@media screen and (max-device-width: 1024px)): Update width for iPads.

Adjust styles for iOS.
* Animometer/developer.html: Different divs contain the iframe, so use a class instead and update the style rules.
* Animometer/index.html:
* Animometer/resources/debug-runner/animometer.css: Remove extraneous rules.
(@media screen and (min-device-width: 1800px)): Move this up.
* Animometer/resources/runner/animometer.css: Add rules to accommodate iOS.

Get rid of prefixed flex properties for now.
* Animometer/resources/debug-runner/animometer.css:
* Animometer/resources/runner/animometer.css:

Update the structure of the harness. Remove the JSON-per-test but keep the JSON of the whole test run. Use the full page in order to display the graph.
* Animometer/developer.html: Update several of the JS file includes to UTF-8. Remove header and footer. The test results screen includes score, average, and worst 5% statistics.
* Animometer/index.html: Make structure similar to developer.html.
* Animometer/resources/debug-runner/animometer.css: Remove most of the button rules since they are superfluous. Move the progress bar to the top, fixed. Update the results page rules.
* Animometer/resources/debug-runner/animometer.js: Remove most of the additions to sectionsManager since they are no longer needed.
(setSectionHeader): Updates the header of the section.
(window.suitesManager._updateStartButtonState): Update selector.
(showResults): Add a keypress event for selecting different data for copy/paste. Update how the results are populated. Include the full test JSON in a textarea, rather than requiring a button press.
(showTestGraph):
* Animometer/resources/debug-runner/tests.js: Update the structure of Headers. Define different kinds of headers. Headers can control their title, and the text used as the cell contents, including class name.
* Animometer/resources/extensions.js:
(ResultsTable): Update to include a flattened version of the headers, used while populating table contents. Remove unneeded helper functions for creating the table. Rename "show" to "add".
* Animometer/resources/runner/animometer.css: Update rules to accommodate the new structure.
* Animometer/resources/runner/animometer.js:
(window.sectionsManager.setSectionScore): Helper function to set the score and mean for a section.
(window.sectionsManager.populateTable): Helper function to set the table.
(window.benchmarkController.showResults): Refactor.
(window.benchmarkController.selectResults): Update selectors.
* Animometer/resources/runner/tests.js: Set Headers. The debug harness extends it.

Update the debug runner to have similar names to the basic runner. Include that page's CSS and remove extraneous CSS rules. Get rid of the statistics table #record.
* Animometer/developer.html: Rename #home to #intro. Rename .spacer to hr.
* Animometer/resources/debug-runner/animometer.css: Set to flexbox when selected.
* Animometer/resources/debug-runner/animometer.js: Remove recordTable.
(window.suitesManager._updateStartButtonState): Update selector to #intro.
(setupRunningSectionStyle): Deleted.
* Animometer/resources/runner/animometer.css:
(#test-container.selected): Change to flex-box only when visible.

Remove recordTable.
* Animometer/resources/debug-runner/benchmark-runner.js:
(BenchmarkRunner.prototype._runTestAndRecordResults):
* Animometer/resources/runner/tests.js:
* Animometer/tests/bouncing-particles/resources/bouncing-canvas-images.js:
* Animometer/tests/bouncing-particles/resources/bouncing-canvas-particles.js:
* Animometer/tests/bouncing-particles/resources/bouncing-canvas-shapes.js:
* Animometer/tests/bouncing-particles/resources/bouncing-css-images.js:
* Animometer/tests/bouncing-particles/resources/bouncing-css-shapes.js:
* Animometer/tests/bouncing-particles/resources/bouncing-particles.js:
* Animometer/tests/bouncing-particles/resources/bouncing-svg-images.js:
* Animometer/tests/bouncing-particles/resources/bouncing-svg-shapes.js:
* Animometer/tests/examples/resources/canvas-electrons.js:
* Animometer/tests/examples/resources/canvas-stars.js:
* Animometer/tests/misc/resources/compositing-transforms.js:
* Animometer/tests/resources/main.js:
* Animometer/tests/resources/stage.js:
(StageBenchmark): Remove _recordTable.
* Animometer/tests/simple/resources/simple-canvas-paths.js:
* Animometer/tests/simple/resources/simple-canvas.js:
* Animometer/tests/template/resources/template-canvas.js:
* Animometer/tests/template/resources/template-css.js:
* Animometer/tests/template/resources/template-svg.js:
* Animometer/tests/text/resources/layering-text.js:
* Animometer/resources/debug-runner/animometer.js:
(willStartFirstIteration): Fix selector, since results-table is used in multiple places, so it cannot be an id.

Make it possible to select the scores, or the whole table data, by cycling through different selections via a key press of 's'.
* Animometer/resources/runner/animometer.js:
(window.benchmarkController.showResults): Attach a keypress handler if it hasn't been added already.
(window.benchmarkController.selectResults):
* Animometer/resources/runner/tests.js: Cycle through different ranges.

Fix a few fly-by errors.
* Animometer/resources/debug-runner/benchmark-runner.js:
(BenchmarkRunnerState.prototype.prepareCurrentTest): Update the frame relative path, since the files are now in the top directory instead of inside runner/.
(BenchmarkRunner.prototype._runTestAndRecordResults): Incorrect reference to function.
(BenchmarkRunner.prototype.step): Member variable is never used.

A little stylistic cleanup.
* Animometer/resources/debug-runner/benchmark-runner.js:
* Animometer/resources/extensions.js:
(window.DocumentExtension.createElement):
* Animometer/tests/resources/main.js:
(Benchmark.prototype.record):
* Animometer/tests/resources/stage.js:
(StageBenchmark.prototype.showResults): Reverse progress and message. The message appears less frequently than the progress.
* Animometer/tests/simple/resources/simple-canvas.js:
(SimpleCanvasBenchmark): Remove unused options.

Add a newer version of the harness in a new page. Consolidate differences between the two pages.
* Animometer/developer.html: Include runner/animometer.js. Rename the JS function to run the benchmark to startBenchmark() instead of startTest(). Rename #running to #test-container.
* Animometer/index.html: Added. Similarly calls startBenchmark() and has #test-container.
* Animometer/resources/debug-runner/animometer.css: Make the canvas 2:1 (1200px x 800px) instead of 4:3.

Split out benchmarkRunnerClient and benchmarkController.
* Animometer/resources/debug-runner/animometer.js: Move needed functions out of benchmarkRunnerClient, and leave the rest here to extend that object. Get rid of _resultsTable and move populating the results table into benchmarkController. Rename _resultsDashboard to results and make it accessible for other objects to use.
(willAddTestFrame): This is unnecessary. Remove.
(window.sectionsManager.showScore): Grab it from the results object instead of benchmarkRunnerClient.
(window.sectionsManager.showSection): Deleted. Moved to runner/animometer.js.
(window.benchmarkController._runBenchmark): Deleted. Mostly moved into _startBenchmark.
(window.benchmarkController.startBenchmark): Refactor to call _startBenchmark.
(window.benchmarkController.showResults): Include most of benchmarkRunnerClient.didFinishLastIteration() here.
* Animometer/resources/debug-runner/benchmark-runner.js:
(BenchmarkRunner.prototype._appendFrame): Remove unneeded call to willAddTestFrame.
* Animometer/resources/extensions.js:
(ResultsDashboard): Change the class to process the sampler data on demand and hold onto that data for later referencing.
(ResultsDashboard.prototype.toJSON): Deleted.
(ResultsDashboard.prototype._processData): Rename toJSON to _processData since it's not really outputting JSON. Store the processed data into a member variable that can be referenced later.
(ResultsDashboard.prototype.get data): Process the data if it hasn't already been processed.
(ResultsDashboard.prototype.get score): Process the data if it hasn't already been processed, then return the aggregate score.
(ResultsTable.prototype._showHeader): When outputting the results to a table, don't force the need for an empty children array. This was to allow for a header row in the table that spanned multiple columns. In the simpler harness, this is not needed.
(ResultsTable.prototype._showEmptyCells):
(ResultsTable.prototype._showTest): This hardcoded the columns. At least for the name and score, which is the bare minimum needed for the simpler harness, key off of the header name provided.
* Animometer/resources/runner/animometer.css: Added. Use a similar 2:1 ratio. The score tables are split into the data and the headers, and are also displayed RTL so that a later patch allows a user to copy-paste the data easily.
* Animometer/resources/runner/animometer.js: Added. Use a simpler version of benchmarkRunnerClient. The debug harness will extend these classes.
(window.benchmarkController._startBenchmark): Used by both harnesses.
(window.benchmarkController.startBenchmark): Set hard-coded options.
(window.benchmarkController.showResults): Includes most of benchmarkRunnerClient.didFinishLastIteration() here.

Get rid of utilities.js. Move it all into extensions.js.
* Animometer/resources/extensions.js:
* Animometer/tests/resources/utilities.js: Removed.
* Animometer/tests/bouncing-particles/bouncing-canvas-images.html: Remove script link.
* Animometer/tests/bouncing-particles/bouncing-canvas-shapes.html: Ditto.
* Animometer/tests/bouncing-particles/bouncing-css-images.html: Ditto.
* Animometer/tests/bouncing-particles/bouncing-css-shapes.html: Ditto.
* Animometer/tests/bouncing-particles/bouncing-svg-images.html: Ditto.
* Animometer/tests/bouncing-particles/bouncing-svg-shapes.html: Ditto.
* Animometer/tests/examples/canvas-electrons.html: Ditto.
* Animometer/tests/examples/canvas-stars.html: Ditto.
* Animometer/tests/misc/compositing-transforms.html: Ditto.
* Animometer/tests/simple/simple-canvas-paths.html: Ditto.
* Animometer/tests/template/template-canvas.html: Ditto.
* Animometer/tests/template/template-css.html: Ditto.
* Animometer/tests/template/template-svg.html: Ditto.
* Animometer/tests/text/layering-text.html: Ditto.

Split tests.js into two. Add a new suite to runner/tests.js.
* Animometer/developer.html: Update the script order. Scripts from debug-runner/ will always build on those from runner/, and have the same name.
* Animometer/resources/debug-runner/tests.js: Move the "complex examples" suite into "miscellaneous tests".
(Suite): Deleted.
(Suite.prototype.prepare): Deleted.
(Suite.prototype.run): Deleted.
(suiteFromName): Deleted.
* Animometer/resources/runner/tests.js: Added. Take definitions and functions needed by the test harness. Leave the test suites behind.
(Suite): Moved from the debug script.
(Suite.prototype.prepare): Ditto.
(Suite.prototype.run): Ditto.
(suiteFromName): Ditto.
(testFromName): Ditto.

Move benchmark resources out into resources/debug-runner, and update URLs.
* Animometer/developer.html: Renamed from PerformanceTests/Animometer/runner/animometer.html.
* Animometer/resources/debug-runner/animometer.css: Renamed from PerformanceTests/Animometer/runner/resources/animometer.css.
* Animometer/resources/debug-runner/animometer.js: Renamed from PerformanceTests/Animometer/runner/resources/animometer.js.
* Animometer/resources/debug-runner/benchmark-runner.js: Renamed from PerformanceTests/Animometer/runner/resources/benchmark-runner.js.
* Animometer/resources/debug-runner/d3.min.js: Renamed from PerformanceTests/Animometer/runner/resources/d3.min.js.
* Animometer/resources/debug-runner/graph.js: Renamed from PerformanceTests/Animometer/runner/resources/graph.js.
* Animometer/resources/debug-runner/tests.js: Renamed from PerformanceTests/Animometer/runner/resources/tests.js.

Canonical link: https://commits.webkit.org/170645@main
git-svn-id: https://svn.webkit.org/repository/webkit/trunk@194407 268f45cc-cd09-0410-ab3c-d52691b4dbfc
2015-12-24 01:31:47 +00:00
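The on-demand processing described above (run _processData once, then hand the result out through data and score getters) can be sketched roughly as follows. This is illustrative only; the real ResultsDashboard lives in resources/extensions.js, and the scoring below is a stand-in for its actual statistics.

    // Sketch only: lazily process sampler data the first time a getter is used.
    function ResultsDashboard() {
        this._iterationsSamplers = [];   // raw per-iteration sampler data
        this._processedData = null;      // filled in lazily by _processData()
    }

    ResultsDashboard.prototype = {
        push: function(samplerData) {
            this._iterationsSamplers.push(samplerData);
        },

        _processData: function() {
            // Stand-in for the real aggregation: average a per-sampler score.
            var scores = this._iterationsSamplers.map(function(sampler) { return sampler.score; });
            var sum = scores.reduce(function(a, b) { return a + b; }, 0);
            this._processedData = { score: scores.length ? sum / scores.length : 0 };
        },

        get data() {
            if (!this._processedData)
                this._processData();
            return this._processedData;
        },

        get score() {
            if (!this._processedData)
                this._processData();
            return this._processedData.score;
        }
    };

Processing eagerly when the test frame is torn down, and referencing the cached result afterwards, is what avoids asking the (now removed) frame for its JSON later.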
<main>
<section id="intro" class="selected">
[MotionMark] Add support for version numbers
https://bugs.webkit.org/show_bug.cgi?id=186479
Reviewed by Said Abou-Hallawa.

Add support for displaying the version number as well as including it in the JSON results. When loading the front page, script replaces any element with classname version with the version number of the benchmark, which is stored in Strings.version. The JSON structure for the results includes a new version property:
{ "version": "1.0", "options": { ... }, "data": [ ... ] }
When dragging a results file, the version listed will come from the JSON file. Older results will not have had the version property, in which case it will default to "1.0".

* MotionMark/index.html: Update title to some other default. Script will update it. Include the version number in the logo title.
* MotionMark/developer.html: Ditto.
* MotionMark/about.html: Ditto.
* MotionMark/resources/runner/motionmark.js:
(ResultsDashboard): Update constructor to include version. This is used when serializing results out to JSON, and displaying the results panel in developer mode.
(ResultsDashboard._processData): When running the benchmark, include benchmark version string in the results object.
(ResultsDashboard.version):
(window.benchmarkRunnerClient.willStartFirstIteration): When running the benchmark, pass the benchmark version string to the dashboard, which holds the results.
(window.sectionsManager.setSectionVersion): Helper function to update the element in the section with the class name version.
(window.benchmarkController.initialize): Populate all DOM elements with class name "version" with the version string. Update the page title.
(window.benchmarkController.showResults): When showing results, update the version string based on what is included in the JSON results, which would be the same as the benchmark version.
* MotionMark/resources/runner/motionmark.css: Include missing copyright. Wrap the SVG logo in a div and include the version string.
* MotionMark/resources/strings.js: Add strings for the page title template, and the version.
* MotionMark/resources/debug-runner/motionmark.css:
* MotionMark/resources/debug-runner/motionmark.js:
(window.benchmarkRunnerClient.willStartFirstIteration): When running the benchmark, pass the benchmark version string to the dashboard, which holds the results.
(window.benchmarkController.initialize): Populate all DOM elements with class name "version" with the version string. Update the page title. When dragging in JSON results, look for version to pass to the dashboard. If it doesn't exist, default to "1.0".
(window.benchmarkController.showResults): When showing results, update the version string based on what is included in the JSON results, instead of the current benchmark version.
* MotionMark/resources/debug-runner/tests.js: Update page title template.

Canonical link: https://commits.webkit.org/202238@main
git-svn-id: https://svn.webkit.org/repository/webkit/trunk@233147 268f45cc-cd09-0410-ab3c-d52691b4dbfc
2018-06-25 15:38:03 +00:00
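A minimal sketch of the version handling described above. This is not the actual motionmark.js code; it only illustrates the two behaviors the entry calls out, with Strings.version assumed to hold the benchmark's version string.

    // Sketch only: fill every element with class "version" with the benchmark version.
    function populateVersionElements(versionString) {
        document.querySelectorAll(".version").forEach(function(element) {
            element.textContent = versionString;
        });
    }

    // Sketch only: results files written before the version property existed
    // (e.g. dragged-in older results) default to "1.0".
    function versionFromResults(results) {
        return results.version || "1.0";
    }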
<header>
<h1>MotionMark</h1>
<h2>version <span class="version"></span></h2>
</header>
<div class="body">
<div>
<div id="suites">
<h2>Suites:</h2>
<ul class="tree"></ul>
<div><span id="drop-target">Drop results here</span></div>
</div>
<div id="options">
<h2>Options:</h2>
<form name="benchmark-options">
<ul>
MotionMark: ensure that timestamps are valid during warm up phase of tests
https://bugs.webkit.org/show_bug.cgi?id=210640
Reviewed by Said Abou-Hallawa.

Ensure that Benchmark._benchmarkStartTimestamp is set during the warm up phase. Otherwise it is NaN, which makes Benchmark.timestamp invalid; that timestamp is used by tests like Multiply to drive the animation. When the warm up phase completes, the start timestamp is reset.

Update the minor version of the benchmark with this bug fix, and include a version changelog in the about page. For testing, add a parameter that allows for adjusting the length of the warm up phase. It remains at its current default, 100 ms.

* MotionMark/about.html: Add a section for the version changelog. Includes links to webkit.org blog posts.
* MotionMark/developer.html: Add a parameter for setting the warmup length. Remove the Kalman filter parameters, since they should always be fixed.
* MotionMark/resources/runner/motionmark.css: Include styles to show the version log.
* MotionMark/resources/runner/motionmark.js: Factor out default options to a property on window.benchmarkController. Include the default warmup length of 100 ms.
(window.benchmarkController.startBenchmark): Refactor to use benchmarkDefaultParameters.
* MotionMark/resources/debug-runner/motionmark.js: Ditto.
* MotionMark/resources/strings.js: Update version number.
* MotionMark/tests/resources/main.js:
(_animateLoop): Set _benchmarkTimestamp during the warmup phase. Check the warmup length. The _benchmarkTimestamp variable is still reset when the test begins.

Canonical link: https://commits.webkit.org/223865@main
git-svn-id: https://svn.webkit.org/repository/webkit/trunk@260656 268f45cc-cd09-0410-ab3c-d52691b4dbfc
2020-04-24 17:58:55 +00:00
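A simplified sketch of the warm-up bookkeeping described above, not the real _animateLoop. The _benchmarkStartTimestamp name mirrors the entry; _isWarmingUp and _warmupLength are illustrative assumptions, and the control flow is condensed.

    // Sketch only: keep the start timestamp valid during warm-up, then reset it
    // when the measured portion of the test begins.
    function updateBenchmarkTimestamp(benchmark, rafTimestamp) {
        if (isNaN(benchmark._benchmarkStartTimestamp))
            benchmark._benchmarkStartTimestamp = rafTimestamp;   // valid even while warming up

        var elapsed = rafTimestamp - benchmark._benchmarkStartTimestamp;
        if (benchmark._isWarmingUp && elapsed >= benchmark._warmupLength) {
            benchmark._isWarmingUp = false;
            benchmark._benchmarkStartTimestamp = rafTimestamp;   // reset: measurement starts at zero
            elapsed = 0;
        }
        return elapsed;   // what Benchmark.timestamp would report to the tests
    }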
<li>
<label>Warmup length: <input type="number" id="warmup-length" value="2000"> milliseconds</label>
</li>
<li>
<label>Warmup frame count: <input type="number" id="warmup-frame-count" value="30"> frames</label>
</li>
<li>
<label>First frame minimum length: <input type="number" id="first-frame-minimum-length" value="0"> ms</label>
</li>
<li>
<label>Test length: <input type="number" id="test-interval" value="30"> seconds each</label>
</li>
<li>
<h3>Display:</h3>
<ul>
<li><label><input name="display" type="radio" value="minimal" checked> Minimal</label></li>
<li><label><input name="display" type="radio" value="progress-bar"> Progress bar</label></li>
</ul>
</li>
<li>
<h3>Tiles:</h3>
<ul>
<li><label><input name="tiles" type="radio" value="big" checked> Big tiles</label></li>
<li><label><input name="tiles" type="radio" value="classic"> Classic tiles (512x512)</label></li>
</ul>
</li>
<li>
<h3>Adjusting the test complexity:</h3>
<ul>
<li><label><input name="controller" type="radio" value="ramp" checked> Ramp</label></li>
<li><label><input name="controller" type="radio" value="fixed"> Keep at a fixed complexity</label></li>
<li><label><input name="controller" type="radio" value="adaptive"> Maintain target FPS</label></li>
</ul>
</li>
<li>
<label>Target frame rate: <input type="number" id="frame-rate" value="50"> FPS</label>
</li>
<li>
<h3>Time measurement method:</h3>
<ul>
<li><label><input name="time-measurement" type="radio" value="performance" checked> <code>performance.now()</code> (if available)</label></li>
<li><label><input name="time-measurement" type="radio" value="raf"> <code>requestAnimationFrame()</code> timestamp</label></li>
<li><label><input name="time-measurement" type="radio" value="date"> <code>Date.now()</code></label></li>
</ul>
</li>
</ul>
</form>
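<!--
  Illustrative sketch (not part of the harness): how the "time-measurement" choice above
  could map to a timestamp source. The function below is a hypothetical helper used only
  for illustration; the real logic lives in the runner scripts.

  function currentTimeFunction(method) {
      // "performance": high-resolution monotonic clock, preferred when available.
      if (method == "performance" && window.performance && window.performance.now)
          return function () { return performance.now(); };
      // "raf": rather than polling a clock, the harness would use the timestamp passed
      // to the requestAnimationFrame callback, e.g.
      //     requestAnimationFrame(function (timestamp) { /* use timestamp */ });
      // "date": millisecond wall-clock time; the lowest resolution of the three.
      return function () { return Date.now(); };
  }
-->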
</div>
</div>
<p>For accurate results, please take the browser window full screen, or rotate the device to landscape orientation.</p>
<div class="start-benchmark">
<p class="hidden">Please rotate the device to orientation before starting.</p>
<button id="run-benchmark" onclick="benchmarkController.startBenchmark()">Run benchmark</button>
</div>
</div>
</section>
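<!-- Annotation (not in the original file): a hedged note on this section. The debug runner
     appends each test's <iframe> inside the .frame-container div below (the commit history
     references BenchmarkRunner.prototype._appendFrame for this), and #progress-completed is
     presumably widened by the harness as the run advances. In the developer harness the
     progress bar is not shown by default. -->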
<section id="test-container">
<div id="running-test" class="frame-container"></div>
<div id="progress">
<div id="progress-completed"></div>
</div>
</section>
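<!-- Annotation (not in the original file): the results section is filled in by the harness
     after the last iteration (runner/animometer.js, benchmarkController.showResults), which
     reports the aggregate score along with its confidence interval. -->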
<section id="results">
<div class="body">
<h1>MotionMark score</h1>
<div class="detail">
<span class="small">on a small screen (phone)</span>
<span class="medium">on a medium screen (laptop, tablet)</span>
<span class="large">on a large screen (desktop)</span>
</div>
<div>version <span class="version"></span></div>
<p class="score" onclick="benchmarkController.showDebugInfo()"></p>
<p class="confidence"></p>
<div id="results-tables" class="table-container">
<div>
<table id="results-score"></table>
<table id="results-data"></table>
</div>
<table id="results-header"></table>
</div>
<button onclick="benchmarkController.restartBenchmark()">Test Again</button>
<p>
'j': Show JSON results<br/>
's': Select various results for copy/paste (use repeatedly to cycle)
</p>
</div>
</section>
<section id="test-graph">
<div class="body">
<header>
<button onclick="benchmarkController.showResults()">&lt; Results</button>
<h1>Graph:</h1>
<p class="score"></p>
<p class="confidence"></p>
</header>
<nav>
<form name="graph-type">
<ul>
<li><label><input type="radio" name="graph-type" value="time"> Time graph</label></li>
<li><label><input type="radio" name="graph-type" value="complexity" checked> Complexity graph</label></li>
</ul>
</form>
<form name="time-graph-options">
<ul>
<li><label><input type="checkbox" name="markers" checked> Markers</label>
<span>time: <span class="time"></span></span></li>
<li><label><input type="checkbox" name="averages" checked> Averages</label></li>
<li><label><input type="checkbox" name="complexity" checked> Complexity</label>
<span class="complexity"></span></li>
<li><label><input type="checkbox" name="rawFPS" checked> Raw FPS</label>
<span class="rawFPS"></span></li>
<li><label><input type="checkbox" name="filteredFPS" checked> Filtered FPS</label>
<span class="filteredFPS"></span></li>
<li><label><input type="checkbox" name="regressions" checked> Regressions</label></li>
</ul>
</form>
<form name="complexity-graph-options">
<ul class="series">
<li><label><input type="checkbox" name="series-raw" checked> Series raw</label></li>
</ul>
<ul>
<li><label><input type="checkbox" name="regression-time-score"> Controller score</label></li>
<li><label><input type="checkbox" name="bootstrap-score" checked> Bootstrap score and histogram</label></li>
<li><label><input type="checkbox" name="complexity-regression-aggregate-raw" checked> Regression, series raw</label><span id="complexity-regression-aggregate-raw"></span></li>
</ul>
</form>
</nav>
<div id="test-graph-data"></div>
</div>
</section>
</main>
</body>
</html>