Author: overlookerjs

Description: Frontend performance profiling tool
Language: JavaScript
Repository: git://github.com/overlookerjs/overlooker.git
Created: 2019-07-22T14:01:49Z
Community: https://github.com/overlookerjs/overlooker

License: Apache License 2.0

overlooker


This package is a set of utilities for setting up frontend performance profiling in your CI/CD pipeline, while also giving you comprehensive information about what your pages contain.

Overlooker runs batch profiling over a set of pages and collects all the metrics that matter for the frontend.
You can set the number of measurements for each page,
write scripts whose execution speed should be measured,
analyze the content of the resources on your pages, and much more.

In addition, you can compare the collected data with previous profiles
and receive a report on what has changed on your pages.

As a performance-testing tool, Overlooker lets you set thresholds
that limit performance degradation on your project and help you track exactly what caused it.

Installation

```sh
npm i overlooker
```

Usage

Configuration (types)

First, let’s figure out how to start profiling.

```js
const config = {
  host: 'https://example.com', // profiling host - it will be prepended to the urls of all pages
  throttling: { // like throttling in the Chrome DevTools Performance panel
    cpu: 1,
    network: 'WiFi'
  },
  cookies: [{ // an array of cookies to be added to all pages when profiling
    name: 'cookie_name',
    value: 'cookie_value',
    domain: 'example.com'
  }],
  cache: { // cache to use - you can use the built-in wpr binary or your own proxy
    type: 'wpr'
    // to use your own proxy:
    // {
    //   type: 'proxy',
    //   host: string,
    //   restart?: () => Promise<any>
    // }
  },
  count: 10, // number of profiles for each page
  platform: 'desktop', // platform used for profiling (desktop|mobile)
  pages: [{ // array of pages to profile
    name: 'main',
    url: '/',
    layers: {
      meaningfulLayer: '.selector'
    },
    actions: [{ // each page can include several scripts that will be executed after the page is loaded
      name: 'test-action',
      action: async (page) => {
        await page.click('button');
        await page.waitForSelector('#loaded-image');
      }
    }]
  }, {
    name: 'category',
    url: '/'
  }],
  logger: (msg) => console.log(msg), // logger that receives messages during the profiling process
  buildData: { // URL or getter for build data (generated by the Bundle Internals plugin), used to assemble complete profiling data
    url: '/build.json',
  },
  requests: { // functions for filtering requests
    ignore: (url) => url.includes('ad'), // ignore urls matching a pattern
    merge: (url) => url.includes('stats'), // merge duplicate requests
    internalTest: (url) => url.startsWith('https://example.com') || url.startsWith('https://example.io'), // pattern for detecting internal resources
  }
};
```

Most of the parameters are optional, so a minimal config can be as brief as:

```js
const config = {
  host: 'https://example.com',
  pages: [{
    name: 'main',
    url: '/',
  }],
  count: 10
};
```

To collect product-centric metrics, you can use the standard browser APIs on your pages:

Element Timing API

```html
<span elementtiming="some-element-paint"></span>
```

User Timing API

```js
// to detect User Timing API calls, first set up a marks pattern in the profiling configuration
const config = {
  // ...
  customMetrics: {
    timing: /^product-timing\.(.*?)$/i // use the string 'all' to collect all timings
  }
};
```

and then create a mark on your page whose name matches that pattern:

```js
performance.mark('product-timing.some-metric');
```

These metrics will be collected during profiling and included in the resulting JSON.

Profiling (types)

Running the profiling itself is straightforward:

```js
const { profile } = require('overlooker');

const profileResult = await profile({
  ...config,
  host: 'https://master.example.com'
});

await db.saveProfile(revision, profileResult);
```

As a result, you will receive data about the performance of your pages in JSON format.
I recommend saving the data, keyed by the identifier of the measured revision, to your favorite database or directly to the file system (do not forget about rotation).

Impact Analysis (types)

To reduce the cost of profiling and speed it up, I recommend using the page impact analyzer.
To use it, you need the data from a previous impact analysis.

```js
const { impactAnalysis, affectConfigByImpact } = require('overlooker');

const impactData = await impactAnalysis(
  masterDescription, // page impact data from the previous impact analysis
  config, // the same configuration as for profiling
  (element) => element.request.url.includes('ad') // element filter for collecting stable impact data (for example, you can filter dynamic ad urls)
);

await db.saveImpactData(revision, impactData);

const impactedConfig = affectConfigByImpact(config, impactData);
```

Executing this code yields a configuration containing only the pages that have changed compared to the other revision.
Impact analysis data is best saved in a database too.

Comparison (types)

Once profiling is complete, you can compare the result against the profile of an earlier revision,
for example the revision from which the branch you are testing was forked.

```js
const { comparePages } = require('overlooker');

const profileDataFeature = await db.getProfileByRevision(featureRevision);
const profileDataMaster = await db.getProfileByRevision(masterRevision);

const comparison = comparePages(profileDataMaster, profileDataFeature);
```

The comparison gives you the full difference across all metrics,
requests, and modules in chunks (if you used the bundle-internals plugin), and this data
can be used to analyse the performance impact of your code changes.

Analyzing the comparison results (types)

There is also a separate method that checks the comparison results against custom thresholds.
First, let’s see how to define thresholds for your performance metrics;
they let you set whatever deviation limits you need.

Example of thresholds:

```js
const thresholds = {
  default: { // default thresholds (used for all pages)
    'percent.stats.userCentric.timeToInteractive.median': 0.05, // path to a value in the comparison object and its deviation limit
    'percent.stats.elementsTimings.**.median': 0.05 // a path with ** covers all nested paths up to the keys after this point
  },
  main: { // thresholds for the single page named 'main' in the profile configuration
    'percent.stats.custom.timings.*.median': 0.1, // a path with * covers all keys on this level
    'percent.stats.custom.userCentric.{timeToInteractive, speedIndex}.median': 0.1 // a path with {} covers all of the listed keys on this level
  }
};
```
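To make the wildcard semantics concrete, here is a hypothetical matcher (not overlooker's actual implementation) that behaves the way the comments above describe: `*` matches exactly one key, `**` matches any depth, and `{a, b}` matches any of the listed keys:

```javascript
// Illustrative only: convert a threshold path pattern into a regular
// expression and test a concrete dot-separated key path against it.
function pathMatches(pattern, keyPath) {
  const regex = pattern
    .split('.')
    .map((part) => {
      if (part === '**') return '.+';        // any number of nested keys
      if (part === '*') return '[^.]+';      // exactly one key on this level
      const braces = part.match(/^\{(.*)\}$/);
      if (braces) {                          // {a, b} -> (a|b)
        const keys = braces[1].split(',').map((k) => k.trim());
        return `(${keys.join('|')})`;
      }
      // literal key: escape regex metacharacters
      return part.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
    })
    .join('\\.');
  return new RegExp(`^${regex}$`).test(keyPath);
}
```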

You can then use these thresholds to check your comparison:

```js
const { check } = require('overlooker');

const result = check(comparison, thresholds);

if (!result.success) {
  process.exit(1); // for example, you can handle the check result and fail the build
}
```

You can also use the check method on a single profile to enforce a performance budget.

```js
const { check } = require('overlooker');

const result = check(profileDataFeature, budget);
```

Tool for getting build data

Bundle Internals Plugin

Tool for viewing trace events

flame-chart-js