
web-page-replay's Introduction


DEPRECATION NOTICE

This tool is now deprecated and has been replaced by WebPageReplayGo.

For the rationale, please see: https://bit.ly/wpr-go-project

Web Page Replay

Record live Web pages and use them for local performance testing!

How?

Use local DNS and HTTP(S) proxies to capture your live traffic. Then use these captures to replay the exact same content, ensuring that your tests get consistent results that are not affected by the origin servers, the network, etc.
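
A minimal record-then-replay session, using the same invocations that appear in the issues below (sudo is needed for the default ports 80 and 53):

# Record: browse to the pages you want to capture, then Ctrl-C to stop.
sudo ./replay.py --record archive.wpr

# Replay: serve the recorded responses back to the browser.
sudo ./replay.py archive.wpr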

Tell me more

Check out the getting started guide or take a look at the architecture diagram.

Also see the note about web-page-replay code.

I want to help

If you find issues with the project, you can file issues on this repo. If you want to do more and contribute code to help the project evolve, check out our contribution guidelines.

web-page-replay's People

Contributors

andrey-malets, chasephillips, colin-scott, dayoung-shin, eakuefner, hubertwu, jbudorick, jeremymos, maryruthven, mmohabey, nedn, nickie, pavolas, rmcloughlin, sleevi, stevelamm, tonygentilcore, wangzhen127, yoavweiss, zwri


web-page-replay's Issues

Replay uses wrong root CA during replay

Replay wants to be "man-in-the-middle" for HTTPS traffic on Android.
Our current approach is to install a dynamically generated cert onto the phone as a root CA.

However, when replaying, replay fails to sign certs with the dynamically generated root CA. Instead, it uses the root CA stored in the archive at record time.

Replay should not store the root CA in the archive.

Whitelist domains using rule

I've had to patch httpproxy like this to enable real fetches for matching URLs in replay mode. Is there a reason this can't be done using a rule? The only example of a rule is for logging. I created a rule to modify the response and it didn't do anything.

# Resolve against the machine's original nameserver (not the local DNS
# proxy), then fetch the live response instead of the archived one.
real_dns_lookup = dnsproxy.RealDnsLookup(
  name_servers=[platformsettings.get_original_primary_nameserver()],
  dns_forwarding=True, proxy_host='0.0.0.0', proxy_port=53)
real_fetch = httpclient.RealHttpFetch(real_dns_lookup)
response = real_fetch(request)
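
A gated version of the same patch, as a sketch only: the whitelist pattern is made up, and the request attributes (is_ssl, host, full_path) are assumed to be those of ArchivedHttpRequest.

import re

WHITELIST_RE = re.compile(r'^https?://(www\.)?example\.com/')  # hypothetical pattern

def fetch(request, archive_fetch, real_fetch):
  # Serve matching URLs live; everything else comes from the archive.
  url = '%s://%s%s' % ('https' if request.is_ssl else 'http',
                       request.host, request.full_path)
  if WHITELIST_RE.match(url):
    return real_fetch(request)
  return archive_fetch(request)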

SSL Certificate installation for iOS devices?

Hi there,

I am able to install the provided .pem file on my iOS device. It shows up as an installed profile on an iPhone. But replayed SSL sites still pop up the "Cannot verify server identity" dialog in MobileSafari.

I feel like I am missing something simple. Any way to get around this dialog? It breaks automation I've built.

Thanks,
Sid

web-page-replay's record mode is busted on Mac

Log:
(WARNING) 2016-03-14 20:37:02,419 httpclient.call:346 Retrying fetch GET https://www.google.com/complete/search?client=chrome-omni&gs_ri=chrome-ext-ansg&xssi=t&q=g&oit=1&cp=1&pgcl=2&gs_rn=42&psi=tOj_mXYfcWoT6cNy&sugkey=AIzaSyBOti4mM-6x9WDnZIjIeyEU21OpBXqWBgw [('accept-encoding', 'gzip,deflate'), ('host', 'www.google.com')]: CertificateError("hostname '216.58.193.100' doesn't match 'www.google.com'",)

Temporary fix:
Do not translate the host name to an IP address in https://github.com/chromium/web-page-replay/blob/master/httpclient.py#L289, to avoid the CertificateError.

Measuring page load times

How is the page load time measured after replaying requests from archive.wpr?

I see there is a Chrome extension and an App Engine app to manage the results. But the Chrome extension throws a manifest-version warning on Chrome 40.0.2214.94 (64-bit) on OS X.

Is there a way to measure page load time without extensions? It would allow testing on other browsers also.

Also, on Yosemite I'm getting AssertionError: Failed to find ipfw in path.

Thanks

Re-using recorded URLs can cause infinite redirection loops

Reproduction steps:

  1. Run third_party/webpagereplay/replay.py --port 0 --ssl_port 0 --no-dns_forwarding --use_closest_match -r foo.wpr
  2. Run /opt/google/chrome/chrome --no-default-browser-check --no-first-run --ignore-certificate-errors --host-resolver-rules="MAP * 127.0.0.1,EXCLUDE localhost" --testing-fixed-http-port=HTTPS_PORT --testing-fixed-https-port=HTTP_PORT --no-proxy-server --user-data-dir=$(mktemp -d) in a separate terminal with HTTP_PORT and HTTPS_PORT taken from the output of the first command.
  3. Go to https://accounts.google.com/ServiceLogin?continue=https%3A%2F%2Faccounts.google.com%2FManageAccount and log in.
  4. Go to https://mail.google.com/. The load will fail with the following error message: "mail.google.com redirected you too many times.".
  5. Go to https://mail.google.com/?bar. The load will succeed (sometimes the resulting page is completely broken, but that's irrelevant here).

The problem here is that the first https://mail.google.com/ navigation creates a loop:

https://mail.google.com/ → https://accounts.google.com/ServiceLogin?passive=true&continue=https://mail.google.com → ...more redirections... → https://mail.google.com/

Instead of retrieving https://mail.google.com/ from the server again at the end of the loop above, WPR reuses its stored action, which is to redirect to https://accounts.google.com/ServiceLogin?passive=true&continue=https://mail.google.com, thus creating an infinite loop.

Script injection broken by 1362f49

Here's how to reproduce this:

  1. On one shell, run:
./replay.py --port=4080 --ssl_port=4443 --no-dns_forwarding --record foo.wpr
  2. On a second shell, run:
timeout 10 $MY_FAVORITE/chrome --ignore-certificate-errors \
  --host-resolver-rules="MAP *:80 localhost:4080, MAP *:443 localhost:4443, EXCLUDE localhost" \
  'https://www.google.de/search?q=v8'
  3. Ctrl-C the replay.py command on the first shell.

At this point, you can verify that some CHUNK_BOUNDARY are (probably) not being properly handled by running:

./httparchive.py cat foo.wpr | grep BOUNDARY | head -2

  4. Prepare a simple script to be injected. Run:
cat > inj.js <<EOF
(function() {
  window.addEventListener("load", function() {
    alert("Running the injected script!");
  });
})();
EOF
  5. On the first shell, run:
./replay.py --port=4080 --ssl_port=4443 --no-dns_forwarding --inject_scripts=deterministic.js,inj.js --use_closest_match foo.wpr
  6. On the second shell, run:
$MY_FAVORITE/chrome --ignore-certificate-errors \
  --host-resolver-rules="MAP *:80 localhost:4080, MAP *:443 localhost:4443, EXCLUDE localhost" \
  'https://www.google.de/search?q=v8'

At step 6, you should see the alert message "Running the injected script!", but you will not.

If you repeat step 6 after you have checked out commit d23e5a8, it will work as expected. The commit that revealed the problem (I believe, not the one that caused it) is 1362f49.

google-chrome --ignore-certificate-errors does nothing

It looks like web-page-replay is broken on any page that uses HSTS. The --ignore-certificate-errors flag does nothing; an error message is still displayed about NET::ERR_CERT_AUTHORITY_INVALID. This causes network requests to fail, even with a manual override of typing badidea.

WPR certificate problems when using WPR to record a Chrome Telemetry pageset

When @nedn tried to rerecord the Chrome Telemetry v8_infinite_scroll_ignition page set from the Chromium src/ directory with

./tools/perf/record_wpr v8_infinite_scroll_ignition --browser=android-system-chrome --story-filter=discourse

(with an Android Galaxy S5 or Nexus 5 connected) he got an error that looked like this:

[screenshot of the error]

Any idea what might be going wrong?

Unsuccessfully trying to record https://facebook.com

Command being used:

$ ./replay.py --record archive.wpr --should_generate_certs
I've also tried:
$ ./replay.py --record archive.wpr --no-ssl
but it does not load the page. It seems I need to add a certificate:
https://cl.ly/2H1m2j1n0U1c

I would have used pyOpenSSL==0.13.0 but it does not build on Mac OS 10.12 (macOS Sierra).

(wpr) armenzg@armenzg-mbp web-page-replay$ pip freeze
appdirs==1.4.3
asn1crypto==0.22.0
cffi==1.10.0
cryptography==1.8.1
enum34==1.1.6
idna==2.5
ipaddress==1.0.18
packaging==16.8
pycparser==2.17
pyOpenSSL==17.0.0
pyparsing==2.2.0
six==1.10.0

(ERROR) 2017-04-24 14:03:41,986 sslproxy._SetUpUsingDummyCert:61 Dropping request without SNI
(ERROR) 2017-04-24 14:03:41,987 sslproxy.handle_servername:51 Exception in SNI handler: [('SSL routines', 'SSL_shutdown', 'shutdown while in init')]
Exception in thread Thread-55:
Traceback (most recent call last):
File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 801, in __bootstrap_inner
self.run()
File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 754, in run
self.__target(*self.__args, **self.__kwargs)
File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/SocketServer.py", line 599, in process_request_thread
self.handle_error(request, client_address)
File "/Users/armenzg/repos/web-page-replay/httpproxy.py", line 419, in handle_error
_HandleSSLCertificateError()
File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/SocketServer.py", line 596, in process_request_thread
self.finish_request(request, client_address)
File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/SocketServer.py", line 331, in finish_request
self.RequestHandlerClass(request, client_address, self)
File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/SocketServer.py", line 650, in init
self.setup()
File "/Users/armenzg/repos/web-page-replay/sslproxy.py", line 82, in setup
_SetUpUsingDummyCert(self)
File "/Users/armenzg/repos/web-page-replay/sslproxy.py", line 63, in _SetUpUsingDummyCert
raise certutils.Error('SSL handshake error %s: %s' % (host, str(v)))
Error: SSL handshake error www.facebook.com: [('SSL routines', 'tls_post_process_client_hello', 'no shared cipher')]
Exception in thread Thread-56:
Traceback (most recent call last):
File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 801, in __bootstrap_inner
self.run()
File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 754, in run
self.__target(*self.__args, **self.__kwargs)
File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/SocketServer.py", line 599, in process_request_thread
self.handle_error(request, client_address)
File "/Users/armenzg/repos/web-page-replay/httpproxy.py", line 419, in handle_error
_HandleSSLCertificateError()
File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/SocketServer.py", line 596, in process_request_thread
self.finish_request(request, client_address)
File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/SocketServer.py", line 331, in finish_request
self.RequestHandlerClass(request, client_address, self)
File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/SocketServer.py", line 654, in init
self.finish()
File "/Users/armenzg/repos/web-page-replay/sslproxy.py", line 86, in finish
self.connection.shutdown()
File "/Users/armenzg/venv/wpr/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1715, in shutdown
self._raise_ssl_error(self._ssl, result)
File "/Users/armenzg/venv/wpr/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1378, in _raise_ssl_error
_raise_current_error()
File "/Users/armenzg/venv/wpr/lib/python2.7/site-packages/OpenSSL/_util.py", line 54, in exception_from_error_queue
raise exception_type(errors)
Error: [('SSL routines', 'SSL_shutdown', 'shutdown while in init')]

Tool for converting HAR to WPR format?

I have some HAR files that I would like to replay to my browser. Are there any existing tools for converting HAR to WPR's format, so that I can do the replay?

Fix injected Date at the date it was recorded

The current deterministic implementation of Date is fixed at "Feb 29 2008 02:26:08 GMT+0000". This causes problems for webpages that use the current date to set the expiry date of a cookie. Since the date is so far back, the cookie is usually immediately discarded, which is unexpected behavior and can lead to problems such as infinite refresh loops:

  1. Run third_party/webpagereplay/replay.py --port 0 --ssl_port 0 --no-dns_forwarding --use_closest_match -r foo.wpr
  2. Run /opt/google/chrome/chrome --no-default-browser-check --no-first-run --ignore-certificate-errors --host-resolver-rules="MAP * 127.0.0.1,EXCLUDE localhost" --testing-fixed-http-port=HTTPS_PORT --testing-fixed-https-port=HTTP_PORT --no-proxy-server --user-data-dir=$(mktemp -d) in a separate terminal with HTTP_PORT and HTTPS_PORT taken from the output of the first command.
  3. Go to http://m.vk.com/. The page keeps refreshing infinitely.

In the example above, the following infinite loop happens:

  1. Load http://m.vk.com/
  2. Check if cookie exists (it doesn't)
  3. Write cookie (with the deterministic date provided by WPR 2008 + 1 month)
  4. Refresh
  5. Check if cookie exists (it's 2016 so it already expired)
  6. Write cookie (with the deterministic date provided by WPR 2008 + 1 month)
  7. Refresh
  8. ...

Proposal: Use the recording date as the injected deterministic date. Note that this will require storing the recording date in the generated WPR files.
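
A sketch of the proposal, assuming a new recorded_at timestamp is stored in the archive (no such field exists today); the hard-coded seed string is the one quoted above:

import time

HARDCODED_SEED = 'Feb 29 2008 02:26:08 GMT+0000'

def inject_recording_date(deterministic_js, recorded_at):
  # Swap the hard-coded seed date in deterministic.js for the archive's
  # recording timestamp, so injected Date values track record time.
  seed = time.strftime('%b %d %Y %H:%M:%S GMT+0000', time.gmtime(recorded_at))
  return deterministic_js.replace(HARDCODED_SEED, seed)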

How to kill web-page-replay subprocess properly?

Hey,

I'm writing a script to programmatically start and then shutdown web-page-replay.

I'm having difficulty properly shutting down web-page-replay. In particular, when I:

  • spawn a web-page-replay subprocess
  • send SIGTERM to the web-page-replay process
  • call wait() on the killed subprocess's pid

it appears that the web-page-replay subprocess does not exit properly. As a result, /etc/resolv.conf is still set to 127.0.0.1.

Calling wait() should, in theory, clean up all of web-page-replay's children. But, apparently this isn't happening?

I also tried sending SIGTERM to web-page-replay's process group, but this didn't work either.

What's the proper way to shutdown a web-page-replay subprocess?
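
A minimal version of what I'm doing; SIGINT is shown here since that is what an interactive Ctrl-C delivers, and my question is effectively whether SIGTERM takes the same cleanup path:

import signal
import subprocess

proc = subprocess.Popen(['sudo', './replay.py', 'archive.wpr'])
# ... run tests against the proxy ...
proc.send_signal(signal.SIGINT)  # Ctrl-C equivalent; interactive use restores resolv.conf
proc.wait()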

Thanks!
-Colin

Option to replay original timings?

Is there a way to tell the WPR proxy to replay each response with the same network delays as observed in the original execution? E.g., if a request for CSS took 151ms to get a response, the replay proxy would wait 151ish ms before sending the recorded response.

ipfw is too coarse-grained, as it applies delays uniformly to all responses.
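
If WPR stored per-response latency at record time (a hypothetical delay_ms field; the archive does not have one today), replay could do something like:

import time

def serve_with_recorded_delay(archived_response, send_response):
  # Sleep for the originally observed latency before sending the bytes.
  delay_ms = getattr(archived_response, 'delay_ms', 0)  # hypothetical field
  if delay_ms:
    time.sleep(delay_ms / 1000.0)
  send_response(archived_response)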

Infinite redirect chains because of httparchive.ArchivedHttpRequest._TrimHeaders()

Hi there,

We ran into an issue when running Chromium's Sandwich project over a larger set of URLs: a fair number of them follow this pattern when navigating to a given http://foo.bar/ URL:

  1. Clean user cookies
  2. Request to http://foo.bar/ redirects to http://foo.bar/load_balancer;
  3. Request to http://foo.bar/load_balancer redirects to http://foo.bar/ with Set-Cookie;
  4. Request to http://foo.bar/ actually returns the page content.

The current implementation of httparchive.ArchivedHttpRequest returns the recorded response of request 1 for request 3, resulting in an infinite-redirect page load failure in Chrome.

In more detail, WPR does this because httpclient.RecordHttpArchiveFetch.__call__ looks up request in self.http_archive, where request is an httparchive.ArchivedHttpRequest and self.http_archive is an httparchive.HttpArchive.

But HttpArchive subclasses dict, and ArchivedHttpRequest re-implements __hash__ and __eq__ in terms of __repr__, which only considers the subset of request headers kept by httparchive.ArchivedHttpRequest._TrimHeaders; that function prunes out the Cookie request header, among others. This leads to hash collisions between ArchivedHttpRequests that differ only in their Cookie header. As a result, RecordHttpArchiveFetch believes that, despite the different Cookie header, the response for this request has already been recorded, and simply serves the recorded response, creating the infinite redirection explained above.

I have not tried to reproduce this without recording, in httpclient.ReplayHttpArchiveFetch's code path, because I was blocked by WPR recording failures first, but I suspect the same issue there, since ReplayHttpArchiveFetch.__call__ does exactly the same thing: response = self.http_archive.get(request).
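
A stripped-down illustration of the collision; these are not the real classes, just the same __hash__/__eq__-on-trimmed-headers shape:

STRIPPED_HEADERS = ('cookie',)

class Req(object):
  def __init__(self, method, url, headers):
    self.method, self.url, self.headers = method, url, headers

  def _trimmed(self):
    # Mirrors _TrimHeaders: drop Cookie (and friends) before comparing.
    return tuple(sorted((k.lower(), v) for k, v in self.headers.items()
                        if k.lower() not in STRIPPED_HEADERS))

  def __eq__(self, other):
    return (self.method, self.url, self._trimmed()) == \
           (other.method, other.url, other._trimmed())

  def __hash__(self):
    return hash((self.method, self.url, self._trimmed()))

a = Req('GET', 'http://foo.bar/', {'Cookie': ''})
b = Req('GET', 'http://foo.bar/', {'Cookie': 'session=xyz'})
assert a == b  # an archive keyed on Req serves b the response recorded for a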

Mac OS X requirements

@yoavweiss : Does web-page-replay work strictly with Mac OS X 10.6 only? What about OS X versions up until 10.9 (that ship with the ipfw binaries)?

I'm seeing discrepancies between the behavior of web-page-replay on OS X 10.9 and Ubuntu 14.04 ( #56 ).

P.S. I pointed towards you since I saw your name against the commit for the README.md. Sorry if I'm asking you the wrong question. Please feel free to point this question to anyone who might be able to answer. Thanks!

Difference between normal mode and server mode "-M"

This is not really an issue but a general question to understand what is happening under the hood. Sorry my networking background isn't that great.

@nedn Could you give an overview of some of the technical differences between web-page-replay in server mode (using the -M option) and web-page-replay in normal mode (NOT using the -M option)?

I see that, in the server mode, the DNS sits at the IP address of the machine and the HTTP servers sit at 0.0.0.0. In the normal mode, the DNS and HTTP servers sit at 127.0.0.1. What is the difference? Also, in the server mode, I had to change the DNS of my machine to the IP address of my machine.

Problems in Network Throttling

I was not able to get the network throttling feature to work on any modern version of Linux or macOS. Specifically, I was not able to install dummynet on these operating systems. Other people also seem to be having the same problem.
Is there any other workaround for using the network throttling feature, or instructions on how to install dummynet?
Thanks!

Error: unable to shape traffic

After running this command:
sudo ./replay.py --up 128KByte/s --down 4Mbit/s --delay_ms=100 archive.wpr

I get an "Unable to shape traffic" error. Please share working values for up speed, down speed, and delay so that I can shape the traffic.

replay serves out-dated HTTP headers

The "date", "expires", and "last-modified" headers are served unchanged from when they were recorded.

It would make more sense to update "date" to the present, and update "expires" and "last-modified" relative to that.
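
Something along these lines, assuming the archive gains a record-time timestamp to compute the elapsed delta from (and that header names are lower-cased, as in WPR's internal dicts):

import email.utils
import time

def freshen_date_headers(headers, recorded_at):
  # Shift "date", "expires" and "last-modified" forward by the time
  # elapsed since recording, preserving their relative offsets.
  delta = time.time() - recorded_at
  for name in ('date', 'expires', 'last-modified'):
    if name in headers:
      parsed = email.utils.parsedate_tz(headers[name])
      if parsed:
        ts = email.utils.mktime_tz(parsed)
        headers[name] = email.utils.formatdate(ts + delta, usegmt=True)
  return headers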

certutils.get_host_cert does not work behind proxy

Another place where recording HTTPS archives through a proxy does not work yet.

I have a patch that uses httpclient.get_connection (made public) to get a connection and uses that connection to retrieve the server certificate. This resolves the proxy issue because get_connection() is proxy-aware.

cert = connection.sock.getpeercert(True)  # peer certificate in DER form
cert = ssl.DER_cert_to_PEM_cert(cert)     # convert to PEM for the archive
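
Fleshed out a little, the patch looks roughly like this; get_connection's exact signature is the part under discussion, so treat it as a placeholder:

import ssl

def get_host_cert_via_proxy(host, port=443):
  # Let the proxy-aware connection helper do the tunneling, then pull the
  # peer certificate off the established TLS socket.
  connection = httpclient.get_connection(host, port)  # made public by my patch
  connection.connect()
  der_cert = connection.sock.getpeercert(True)
  connection.close()
  return ssl.DER_cert_to_PEM_cert(der_cert)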

What do you think?

Deterministic script is sometimes injected into non-html files

How to reproduce:

  1. Run third_party/webpagereplay/replay.py --port 0 --ssl_port 0 --no-dns_forwarding --use_closest_match -r foo.wpr
  2. Run /opt/google/chrome/chrome --no-default-browser-check --no-first-run --ignore-certificate-errors --host-resolver-rules="MAP * 127.0.0.1,EXCLUDE localhost" --testing-fixed-http-port=HTTPS_PORT --testing-fixed-https-port=HTTP_PORT --no-proxy-server --user-data-dir=$(mktemp -d) in a separate terminal with HTTP_PORT and HTTPS_PORT taken from the output of the first command.
  3. Go to http://9gag.com.
  4. Open the Developer Tools (press F12 or Ctrl+Shift+I).
  5. Go to the Console tab. It contains the following SyntaxError ("Unexpected token <"):

[screenshot from 2016-05-11 16:31:25]

The problem is that WPR injected the deterministic script into a JSON response:

<script>(function () {  var random_count = 0; ... })();</script>data({"okay":true,"result":[{"title":"My sleep life in a comic.","url":"http:\/\/9gag.com\/gag\/a1Mj92v","itemId":"13698","imageURL":"http:\/\/miscmedia-9gag-fun.9cache.com\/images\/featured\/1462858529.3221_rUSEhe_300.jpg"}, ... ]});

This happens because the content type of the response is text/html (rather than application/json).

While the general problem of determining whether a script should be injected is undecidable, I think we should add a heuristic that only injects a script if the file contains at least one <tag>.
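
A sketch of that heuristic at the injection site (the function and regex are illustrative, not WPR's actual code):

import re

HTML_TAG_RE = re.compile(r'<[a-zA-Z][a-zA-Z0-9-]*[\s/>]')

def should_inject(content_type, body):
  # Only inject when the body actually looks like markup, even if the
  # declared content type is text/html.
  return content_type == 'text/html' and HTML_TAG_RE.search(body) is not None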

WPR server fetches certificate from live network during replay mode

From catapult-project/catapult#2953

During replay:

  1. sslproxy: the SNI callback handle_servername(), installed by _SetUpUsingDummyCert, invokes handler.server.get_certificate(host).
  2. httpproxy: get_certificate(self, host) invokes http_archive.get_server_cert().
  3. httparchive: get_server_cert() invokes certutils.get_host_cert(host) if the certificate is not already in the archive: self[request] = create_response(200, body=certutils.get_host_cert(host))
  4. certutils: get_host_cert(host, port=443) opens a connection to the live host to fetch the certificate.
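
A sketch of the guard this implies, inside httparchive; the mode flag and key-builder names are assumptions, not the actual code:

def get_server_cert(self, host):
  # During replay, a missing certificate should be a plain cache miss;
  # only record mode may touch the live network.
  # (self._record_mode is an assumed flag, not the actual attribute.)
  request = self._server_cert_request(host)  # hypothetical key-builder
  if request not in self:
    if not self._record_mode:
      return None
    self[request] = create_response(200, body=certutils.get_host_cert(host))
  return self[request]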

SSLv3 alert handshake failure

How to reproduce:

  1. Run third_party/webpagereplay/replay.py --port 0 --ssl_port 0 --no-dns_forwarding --use_closest_match -r foo.wpr
  2. Run /opt/google/chrome/chrome --no-default-browser-check --no-first-run --ignore-certificate-errors --host-resolver-rules="MAP * 127.0.0.1,EXCLUDE localhost" --testing-fixed-http-port=HTTPS_PORT --testing-fixed-https-port=HTTP_PORT --no-proxy-server --user-data-dir=$(mktemp -d) in a separate terminal with HTTP_PORT and HTTPS_PORT taken from the output of the first command.
  3. Go to https://i1.sndcdn.com/artworks-000139618215-ubm1i6-t500x500.jpg.

The page fails to load ("404 Not Found") and replay.py generates the following output:

2016-05-11 10:51:40,076 DEBUG RealHttpFetch: i1.sndcdn.com /artworks-000139618215-ubm1i6-t500x500.jpg
2016-05-11 10:51:40,427 WARNING Retrying fetch GET https://i1.sndcdn.com/artworks-000139618215-ubm1i6-t500x500.jpg [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com'), ('upgrade-insecure-requests', '1')]: [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
2016-05-11 10:51:40,443 WARNING Retrying fetch GET https://i1.sndcdn.com/artworks-000139618215-ubm1i6-t500x500.jpg [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com'), ('upgrade-insecure-requests', '1')]: [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
2016-05-11 10:51:40,460 WARNING Retrying fetch GET https://i1.sndcdn.com/artworks-000139618215-ubm1i6-t500x500.jpg [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com'), ('upgrade-insecure-requests', '1')]: [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
2016-05-11 10:51:40,476 CRITICAL Could not fetch GET https://i1.sndcdn.com/artworks-000139618215-ubm1i6-t500x500.jpg [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com'), ('upgrade-insecure-requests', '1')]: [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
2016-05-11 10:51:40,476 WARNING Failed to find response for: GET https://i1.sndcdn.com/artworks-000139618215-ubm1i6-t500x500.jpg [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com'), ('upgrade-insecure-requests', '1')] (401ms)
2016-05-11 10:51:40,556 DEBUG RealHttpFetch: i1.sndcdn.com /favicon.ico
2016-05-11 10:51:40,573 WARNING Retrying fetch GET https://i1.sndcdn.com/favicon.ico [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com')]: [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
2016-05-11 10:51:40,592 WARNING Retrying fetch GET https://i1.sndcdn.com/favicon.ico [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com')]: [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
2016-05-11 10:51:40,608 WARNING Retrying fetch GET https://i1.sndcdn.com/favicon.ico [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com')]: [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
2016-05-11 10:51:40,628 CRITICAL Could not fetch GET https://i1.sndcdn.com/favicon.ico [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com')]: [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
2016-05-11 10:51:40,628 WARNING Failed to find response for: GET https://i1.sndcdn.com/favicon.ico [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com')] (73ms)

Accessing the same image via http (http://i1.sndcdn.com/artworks-000139618215-ubm1i6-t500x500.jpg) works absolutely fine:

2016-05-11 10:52:18,407 DEBUG RealHttpFetch: i1.sndcdn.com /artworks-000139618215-ubm1i6-t500x500.jpg
2016-05-11 10:52:18,456 DEBUG Recorded: GET http://i1.sndcdn.com/artworks-000139618215-ubm1i6-t500x500.jpg [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com'), ('upgrade-insecure-requests', '1')]
2016-05-11 10:52:18,457 DEBUG Served: GET http://i1.sndcdn.com/artworks-000139618215-ubm1i6-t500x500.jpg [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com'), ('upgrade-insecure-requests', '1')] (50ms)
2016-05-11 10:52:18,539 DEBUG RealHttpFetch: i1.sndcdn.com /favicon.ico
2016-05-11 10:52:18,645 DEBUG Recorded: GET http://i1.sndcdn.com/favicon.ico [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com')]
2016-05-11 10:52:18,646 DEBUG Served: GET http://i1.sndcdn.com/favicon.ico [('accept-encoding', 'gzip,deflate'), ('host', 'i1.sndcdn.com')] (108ms)

What is sudo needed for?

My apologies for using the issue tracker for asking a question but I could not find a mailing list or an IRC channel. Are there any channels available besides the issue tracker?

I read on this page that sudo is needed because "root permission [is] required to connect to port 80/53".

Would it be possible to change the settings so that those ports are not needed?

I'm looking to run wpr in automation (Windows, Mac & Linux) and I don't know if I have the ability to run as a privileged user.
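
For what it's worth, other issues in this tracker run the proxies on unprivileged ports with DNS forwarding disabled, which avoids sudo at the cost of needing host-resolver flags on the browser side:

./replay.py --port=4080 --ssl_port=4443 --no-dns_forwarding --record foo.wpr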

Bug with script injection due to minification

If you try injecting the following script:

var url = "http://google.de/";

MinifyScript will treat "//" as a comment and the injected script will not work. Minification only happens when reading the injected scripts from files, so the tests will not catch this. I'm uploading a CL to fix this.
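
A tiny repro of the failure mode with a stand-in for MinifyScript; this is just the shape of the bug, not the real implementation:

import re

def naive_minify(script):
  # Strips "//" comments to end of line without tracking string literals.
  return re.sub(r'//[^\n]*', '', script)

print(naive_minify('var url = "http://google.de/";'))
# prints: var url = "http:   (the string literal is truncated)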

"su -c CMD" syntax doesn't work in Android M

The su command requires its first argument to be an integer in Android M:

https://android.googlesource.com/platform/system/extras/+/lollipop-release/su/su.c#43 (Android L)
https://android.googlesource.com/platform/system/extras/+/marshmallow-release/su/su.c#39 (Android M)

Therefore, the AndroidCertInstaller._adb_su_shell() method needs to execute su 0 args instead of su -c args if the device is running Android M (see https://code.google.com/p/chromium/codesearch#chromium/src/build/android/devil/android/device_utils.py&l=320).
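
A sketch of the needed switch; the SDK-level plumbing is illustrative (23 is the documented API level for Android M):

ANDROID_M_SDK = 23

def su_command(sdk_level, args):
  # Android M's su wants a numeric uid as its first argument;
  # older releases use the "su -c CMD" form.
  if sdk_level >= ANDROID_M_SDK:
    return ['su', '0'] + list(args)
  return ['su', '-c', ' '.join(args)]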

This issue causes the following Chromium Telemetry failure:

(ERROR) 2016-02-16 17:36:08,660 network_controller_backend.InstallTestCa:139  Failed to install test certificate authority on target platform. Browsers may fall back to ignoring certificate errors.
Traceback (most recent call last):
  File ".../chromium/src/third_party/catapult/telemetry/telemetry/internal/platform/network_controller_backend.py", line 135, in InstallTestCa
    self._platform_backend.InstallTestCa(self._wpr_ca_cert_path)
  File ".../chromium/src/third_party/catapult/telemetry/telemetry/internal/platform/android_platform_backend.py", line 547, in InstallTestCa
    self._device_cert_util.install_cert(overwrite_cert=True)
  File ".../chromium/src/third_party/catapult/telemetry/third_party/webpagereplay/adb_install_cert.py", line 163, in install_cert
    raise CertInstallError('Cert Install Failed')
CertInstallError: Cert Install Failed

404 not found during record mode

OS: OS X 10.11 & Ubuntu 14.04
Command:

sudo ./replay.py --record ~/archive.wpr

Whenever I try to open a web page, I see the "404 not found" page on the browser. I'm using http://www.ehow.com instead of https://www.ehow.com.

I googled the solution to this but wasn't able to find anything concrete. This thread shows the problem has been fixed but I'm not seeing it work on my machine. Anything extra I need to do apart from simply cloning the repo on my Mac?

Dummynet issue

The suggested Dummynet installation steps don't work - I'm getting an error when I download the source and run make:

error: argument to ‘sizeof’ in ‘memset’ call is the same expression as the destination; did you mean to dereference it? [-Werror=sizeof-pointer-memaccess]
bzero(fs, sizeof(fs)); /* safety */
^
/home/vrege/Downloads/ipfw3-20120610/dummynet2/missing.h:139:34: note: in definition of macro ‘bzero’
#define bzero(s, n) memset(s, 0, n)

Mac Vs Ubuntu behavior discrepancy with Servo

I'm using web-page-replay to test out Servo (https://github.com/servo/servo).

On Mac OS X 10.9:
I'm having issues only in replay mode. I record web pages using Servo, and when I try to replay them, web-page-replay works well for a few requests. After a random number of requests, Servo is able to load web pages that weren't recorded. The behavior is the same even after setting the network preferences on the Mac to use a "proxy" sitting at 127.0.0.1:80, which is where web-page-replay's HTTP server listens.

On Ubuntu 14.04: The web-page-replay setup works perfectly with Servo. There are no issues whatsoever. Without using proxy settings, I'm able to record and replay the right pages.

What is possibly going on?

Webpage does not replay correctly

Steps to reproduce:

  1. Apply https://codereview.chromium.org/2162473002 or wait until it is landed.
  2. ./tools/perf/record_wpr --extra-browser-args=--disable-notifications --chrome-root=$(pwd) v8_desktop_browsing_benchmark --browser=stable --story-filter=facebook
  3. ./tools/perf/run_benchmark v8.browsing_desktop --browser=stable --device=desktop --story-filter=facebook

Expected: the replay performs the same actions as the recording.
Actual: the replay gets stuck at "about:blank" on the second navigation.

Drivers available for Windows 10?

I know this tool has experimental support for Windows, but do you know if drivers are available for Windows 10? I tried the installation steps, and trying to install the service with the "Have Disk" option shows a message saying "Unable to find any drivers for this device".
