HOLISTIC PERFORMANCE

John Resig

Posted 20-Aug-2015

Performance

Performance analysis is amazingly complex

There is no single silver bullet

Don’t want to compromise quality in favor of performance

Also want to communicate the changes in a realistic way

Analyzing Performance

Wall-clock time

Time in different browsers

CPU consumption

Memory consumption

Memory leaks

Bandwidth consumption

Parse time

Battery consumption (Mobile!)

Dictionary Lookups in JavaScript

An interesting example for looking at performance.

Most frequent concern: File Size

Many solutions only optimize for file size

Disregard parse time, or other performance aspects

Naïve Solution

Pull in a raw list of words

Push it into an object for fast property lookups

Uses a lot of file size

Very fast lookups
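The naïve approach above can be sketched in a few lines (the word list here is a tiny stand-in for a real dictionary file):

```javascript
// Naive dictionary: ship the raw word list, then build an object
// keyed by word so lookups are simple property checks.
var words = "cat cats dog dogs word words".split(" ");

var dict = {};
for (var i = 0; i < words.length; i++) {
  dict[words[i]] = true;
}

function isWord(word) {
  // hasOwnProperty guards against inherited keys like "constructor"
  return Object.prototype.hasOwnProperty.call(dict, word);
}
```

The object costs memory proportional to the full word list, but each lookup is a single hash access.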

Trie

A compact structure for storing dictionaries

Optimizes heavily for file size

Can be rather expensive to parse

Can also use a lot of memory
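A minimal trie sketch, illustrative only (not the actual structure from the talk): each node is a plain object, and a `$` property marks the end of a word.

```javascript
// Insert a word one character at a time, creating nodes as needed.
function addWord(trie, word) {
  var node = trie;
  for (var i = 0; i < word.length; i++) {
    var ch = word.charAt(i);
    node = node[ch] || (node[ch] = {});
  }
  node.$ = 1; // end-of-word marker
}

// Walk the trie; the word exists only if we land on a marked node.
function hasWord(trie, word) {
  var node = trie;
  for (var i = 0; i < word.length; i++) {
    node = node[word.charAt(i)];
    if (!node) return false;
  }
  return node.$ === 1;
}

var trie = {};
["cat", "cats", "dog"].forEach(function (w) { addWord(trie, w); });
```

Shared prefixes ("cat"/"cats") are stored once, which is where the file-size win comes from, but every node is a separate object, which is where the memory and parse costs come from.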

[Chart: File Size of Dictionaries, normal vs. gzipped, for Plain String, Binary String, Simple Trie, Optimized Trie, Suffix Trie, and Succinct Trie (scale: 0KB to 1100KB).]

[Chart: Load Speed of Dictionaries. Time to load the dictionary once in Node.js on a 2.8 GHz Core i7, for Plain String, Binary String, Hash Trie, and Succinct Trie (scale: 0ms to 150ms).]

[Chart: Search Speed of Dictionaries. Time to look up one word, found vs. missing, for Plain String, Binary String, Hash Trie, and Succinct Trie (scale: 0ms to 6ms).]

[Chart: Private Memory Usage of Dictionaries after loading the dictionary once, for Plain String, Binary String, Hash Trie, and Succinct Trie (scale: 0MB to 11MB).]

dynaTrace

dynaTrace

One of the best tools available for analyzing the full browser stack

Dig into CPU usage, bandwidth usage, and even performance of browser-internal methods

Works in both IE and Firefox

Practical Performance

Think about the larger context

Pre-optimization is dangerous

Code quality

Importance

Cross-browser compatibility

Performance in the jQuery Project

Rule 1: Prove it.

Prove it.

Any proposed performance optimization must be undisputedly proven.

Show us the proposed changes and how it’ll affect performance across all platforms.

How? JSPerf.

http://jsperf.com/

JSPerf

JSPerf is a great tool

Makes it very easy to build a reproducible test:

http://jsperf.com/valhooks-vs-val/2

JSPerf

JSPerf builds on some of the earlier analysis I did in 2008

http://ejohn.org/blog/javascript-benchmark-quality/

Runs each test as many times as possible within a 5-second window

Even optimizes the harness to reduce loop overhead

Also uses a Java Applet for even better timer accuracy
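The timed-run idea can be sketched as follows. This is a deliberate simplification of what JSPerf's harness actually does (no loop-overhead calibration or statistical analysis), just the core "run until the budget is spent" loop:

```javascript
// Simplified timed benchmark: run fn for roughly budgetMs
// milliseconds and report iterations per second.
function bench(fn, budgetMs) {
  var iterations = 0;
  var start = Date.now();
  // Run in batches of 1000 so Date.now() itself isn't the bottleneck.
  while (Date.now() - start < budgetMs) {
    for (var i = 0; i < 1000; i++) fn();
    iterations += 1000;
  }
  var elapsed = Date.now() - start;
  return { iterations: iterations, opsPerSec: iterations / (elapsed / 1000) };
}

var result = bench(function () { return "42" * 1; }, 100);
```

Fixing the time budget rather than the iteration count is what lets the same test produce comparable numbers on a fast desktop and a slow phone.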

Rule 2: See the Big Picture.

See the Big Picture.

Micro-optimizations are death.

It doesn’t matter how much you unroll a loop if that loop is doing DOM manipulation.

Most crippling web app performance is from DOM performance issues.

Pure JS performance is rarely an issue.
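As a hypothetical illustration of where the big wins actually live: instead of appending to the DOM once per item (each append can trigger layout work), build the markup in one pass and hand it to the browser in a single assignment. The function name and the string-building approach here are my own sketch, not code from the talk:

```javascript
// Build all the markup first, then touch the DOM exactly once.
function renderList(items) {
  var html = [];
  for (var i = 0; i < items.length; i++) {
    html.push("<li>" + items[i] + "</li>");
  }
  return "<ul>" + html.join("") + "</ul>";
  // In a browser: container.innerHTML = renderList(items);
}
```

Collapsing a thousand DOM writes into one dwarfs anything loop unrolling could save in the pure-JS part of that loop.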

Prove the use case.

If you’re proposing an optimization you must prove what it’ll help.

Show real world applications that’ll benefit from the change.

This is especially important as it’ll help stop you from wasting time on performance issues that don’t matter.

Rule 3: Clean Code.

Clean Code.

We won’t compromise our code quality in exchange for performance.

Almost all code quality compromises come from needless micro-optimizations.

~~(1 * string) vs. parseInt( string )

+new Date vs. (new Date).getTime()

Don’t even get me started on loop unrolling.
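For concreteness, both pairs above compute the same values; the "clever" forms just trade readability for a marginal (and browser-dependent) speedup:

```javascript
var s = "42";

// Micro-optimized forms:
var a1 = ~~(1 * s);           // double bitwise NOT truncates to an integer
var b1 = +new Date;           // unary plus coerces the Date to a timestamp

// The readable equivalents:
var a2 = parseInt(s, 10);
var b2 = new Date().getTime();
```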

Rule 4: Don’t Slow IE.

Don’t Slow IE.

Just because performance gets better in one browser doesn’t mean it’ll get better in all browsers.

You shouldn’t compromise performance in other browsers for the sake of one.

(Unless that browser is IE, always improve IE performance.)

Communicating the Results

Creating realistic tests

Communicating in an effective manner

Creating Realistic Tests

Realism

It’s incredibly hard to create realistic test cases

It’s important to look at actual applications

We frequently use Google Code Search to find out how people are using our APIs

(This gives us the knowledge that we need when we want to deprecate an API as well.)

Communicating the Results

Browserscope

Collection of performance results

Organized by browser

JSPerf plugs right in

Creating Results

Pull the results directly from BrowserScope

Best: Compare old versions to new versions

Within the context of all browsers

[Chart: .val() (get) benchmark, jQuery 1.5.2 vs. 1.6, across Chrome 11, Safari 5, Firefox 4, Opera 11, IE 7, IE 8, and IE 9 (scale: 0 to 700,000 test iterations; higher is better).]

Competition

You might be inclined to compare performance against other frameworks, libraries, applications, etc.

This tends to create more problems than it’s worth

And the comparison isn’t always one-to-one

If competing, agree on some tests first

Work with your competition to create realistic tests

Compete Against Yourself

In the jQuery project we work to constantly improve against ourselves

Every release we try to have some performance improvements

Always compare against our past releases

Rewriting API internals is a frequent way of getting good performance results

More Information

Thank you!

http://ajax.dynatrace.com/ajax/en/

http://jsperf.com

http://www.browserscope.org

http://ejohn.org/blog/javascript-benchmark-quality/

http://ejohn.org/