From MozillaWiki

The Janus proxy experiment has ended.

The code remains on github, but we are no longer actively developing it.

Overview

Janus is a compression and privacy proxy with the goal of providing more secure and efficient mobile browsing. The Goals section details our objectives and how we want to achieve them.

The Janus Proxy currently uses the experimental SPDY protocol, but will move to the new HTTP/2 standard once it is finalized.

The proxy compresses and re-encodes media files using the techniques described in the Features section.

Communication Channels

Chat

You can reach the developers of the proxy on the #janus channel on the Mozilla IRC network.

Mailing List

For asynchronous discussions, use the janus-dev mailing list.

Bug Reports

The easiest way to report bugs is the Report Bug feature in the Janus add-on. It reports your client settings along with the additional information you provide, giving us enough context to reproduce the issue. Alternatively, you can create a GitHub issue directly. Please provide explicit steps to reproduce the issue, and note your client configuration and browser version.

Janus Proxy

The Janus Proxy source code is hosted on GitHub. We are using GitHub issues to track bugs and features.

Firefox (for Android)

The Janus Proxy requires secure SPDY (soon HTTP/2) proxy support in the browser, which is available in Firefox 33 and later. Currently (August 2014), these Firefox versions are served on the Aurora and Nightly channels.

Required or desired features that Janus could utilize are tracked here:

bug 378637 - Add support for connecting to HTTP proxy over HTTPS

bug 366559 - Firefox/Gecko should support LZMA as an HTTP transfer-encoding method

bug 1010068 - Disable OCSP on Firefox for Android

bug 944117 - Implement support for WebM Alpha

Known issues affecting Janus are tracked here:

bug 1014589 - Fennec crashes on page load when connected with SPDY proxy

bug 1047485 - HTTPS sites slow when SPDY proxy in use

bug 1025582 - CORS request intermittently fails after refreshing page rapidly

Janus Add-On

To make configuring the proxy easier, we provide the Janus add-on for your Firefox (for Android) browser. The add-on settings also give you more control over the proxy's behavior, such as compression levels, and provide access to some experimental features.
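Under the hood, browser proxy settings of this kind are commonly expressed as a PAC (proxy auto-config) file, which is itself a small piece of JavaScript. Here is a minimal sketch: the "HTTPS" directive tells the browser to reach the proxy itself over TLS (the feature tracked in bug 378637), and proxy.example.org:443 is a placeholder, not a real Janus endpoint.

```javascript
// Minimal PAC (proxy auto-config) sketch. The browser calls
// FindProxyForURL for every request to decide how to route it.
function FindProxyForURL(url, host) {
  // Keep local traffic direct.
  if (host === "localhost" || host === "127.0.0.1") {
    return "DIRECT";
  }
  // Route everything else through the secure proxy (hypothetical
  // endpoint), falling back to a direct connection if it is unreachable.
  return "HTTPS proxy.example.org:443; DIRECT";
}
```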

Firefox OS

We are also working on supporting the proxy on Firefox OS; progress is tracked in bug 1041389.

Goals

The main goal of the Janus Proxy is to provide a better mobile browsing experience by enhancing privacy, reducing the download size of web content, and thereby decreasing load times on slow connections.

Here are our objectives:

Increase user privacy by encryption

Reduce bandwidth requirements by data compression

Reduce page load times on slow connections using efficient transmission protocols

The potential advantages for users are:

Encryption of traffic on insecure WiFi networks

Increased mileage on limited data plans

Faster browsing on slow connections

Features

This section describes the features of the Janus Proxy and their implementation status.

Status





Open Issues (suitable for contributors)

If you want to contribute to the project and don't know where to start, this list should give you an idea of what we are currently working on or planning. Alternatively, you can pick a feature to improve or an issue to tackle directly.

Issue        | Description                                                                  | Mentors         | See also
HTTP/2       | Migrate to HTTP/2, adjust handling, test for regressions                    | snorp, esawin   | gh node-janus/15, node-http2
Add-on       | Update to HTTPS PAC, fix/remove scrollbar on Linux, add quality control settings | snorp, esawin   | gh node-janus/67, janus-addon
Image worker | Fix image-worker issues, rewrite to TCP/HTTP interface, enable per-request settings | sylvain, esawin | gh janus-image-worker/2

Transmission Protocol

Routing requests through a SPDY proxy requires fewer open TCP connections (multiplexing), decreases packet sizes (header compression), and reduces the number of packets (header caching) compared to HTTP/1.1. Low-bandwidth and high-latency connections should benefit from this.

Image Compression

The current trend on the Web is an increase in average page size, driven especially by high-resolution images. Additional image compression and downsizing should decrease bandwidth requirements and enable faster page loads. We want to reduce image sizes with no noticeable quality degradation, or with a dynamic quality setting controlled by the user.
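One way such a dynamic quality setting could work is sketched below. This is a hypothetical policy, not the actual Janus code: it maps a user-controlled aggressiveness level to a re-encode quality, and skips tiny images where recompression is not worth the extra latency. The thresholds and quality values are made up for illustration.

```javascript
const MIN_BYTES = 8 * 1024; // below this, recompression rarely pays off

// Pick a re-encode quality from a user level (0 = conservative,
// 3 = aggressive), or null to pass the image through untouched.
function pickQuality(userLevel, originalBytes) {
  if (originalBytes < MIN_BYTES) {
    return null;
  }
  // Higher levels trade more visual quality for more bandwidth savings.
  const qualityByLevel = [95, 90, 80, 65];
  const level = Math.min(Math.max(userLevel, 0), qualityByLevel.length - 1);
  return qualityByLevel[level];
}
```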

Text Compression

We use gzip and possibly xz. TODO

Caching

TODO

Safe Browsing

TODO

Other

Other crazy stuff we could (technically) do:

Man-in-the-middle HTTPS traffic only for images (so they get recompressed) with user consent

Convert animated GIF to H264/WebM/whatever. gfycat.com does this now with apparently good results

Pre-Shumwayize Flash content

Automatic readability mode for some sites

Add support for adaptive streaming (MPEG-DASH) for servers that don't support it. Transcode to lower bitrates. Proxy HLS sites as MPEG-DASH

Pre-rendering of pages

Taras' crazy stuff:

Use zsync (http://zsync.moria.org.uk/) for content that changes a little, but frequently.

This could also be done for web cut-and-paste. The server could insist on serving zsync summaries of common files (e.g. CSS libraries, JS libraries like jQuery), and the client would do a distance search locally to see if it has anything similar, then make only the range requests needed to construct the new version.

Currently we send an If-Modified-Since: header; we could add a Supports-Zsync: header.

This would make the server reply with a .zsync file, which the client would then use to submit Range: requests.
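To make that flow concrete, here is a hypothetical sketch of the headers involved. Supports-Zsync is an invented header name (no such header exists today); the Range syntax follows standard HTTP byte ranges.

```javascript
// Client advertises zsync support alongside the usual conditional header.
// "Supports-Zsync" is an invented header, for illustration only.
function buildConditionalHeaders(lastModified) {
  return {
    "If-Modified-Since": lastModified,
    "Supports-Zsync": "1",
  };
}

// After parsing the .zsync summary, request only the missing byte ranges.
// ranges: array of [start, end] byte offsets.
function buildRangeHeader(ranges) {
  const spec = ranges.map(([s, e]) => s + "-" + e).join(",");
  return { Range: "bytes=" + spec };
}
```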

Asset bundling/inlining for cold loads: the server can detect when the client has nothing in its cache (e.g. no If-Modified-Since) and do a SPDY push from a preset profile. The server could also SPDY-push partial files (e.g. only the CSS that actually gets used) plus a zsync summary of the rest of the contents, so the client can request the missing pieces for other pages on the website; this can hugely reduce transfer sizes. If I remember correctly from https://bugzilla.mozilla.org/show_bug.cgi?id=834865, 70%+ of CSS is leftover cruft. Note this might be easier to do if sites are aware of it and bundle a library that supports breaking CSS/JS into multiple files and waiting on missing pieces.



Research

Image Compression

Image Formats

Format    | Lossy | Transparency support | Supported by Firefox
JPEG      | Yes   | No                   | Yes
JPEG XR   | Yes   | Yes                  | No
JPEG 2000 | Yes   | Yes                  | No
PNG       | No    | Yes                  | Yes
JNG       | Yes   | Yes                  | No
WebP      | Yes   | Yes                  | No
BMP       | No    | Yes                  | Yes

Sample

The sample used for the tests below was built from images on 20 different webpages taken from Alexa's list of the 200 most-visited websites.

Format | Number of images | Total size (MB)
JPEG   | 325              | 12
PNG    | 817              | 14
GIF    | 325              | 1.8

PNG Compression

The PNG sample size is 12383 kB.

Compression method      | New sample size (kB) | Average compression (%) | Total time (s)
optipng -o 1            | 11560                | 7                       | 70
optipng -o 5            | 10940                | 12                      | 481
pngquant                | 5169                 | 59                      | 69
pngcrush                | 10997                | 12                      | 90
pngcrush + optipng -o 5 | 10280                | 17                      | 541
WebP (quality = 90)     | 4741                 | 62                      | 47
WebP (quality = 100)    | 8321                 | 33                      | 48

JPEG Compression

The JPEG sample size is 10846 kB.

Compression method     | New sample size (kB) | Average compression (%) | Total time (s)
mozjpeg                | 9663                 | 11                      | 53
mozjpeg (quality = 90) | 8571                 | 21                      | -
jpegoptim              | 10497                | 4                       | 39
WebP (quality = 100)   | 17551                | -61                     | 42
WebP (quality = 90)    | 8316                 | 24                      | 40
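As a sanity check, the compression columns in the tables above are consistent with the relative size reduction of the whole sample (small deviations for some rows presumably come from per-image averaging). A sketch of the calculation:

```javascript
// Relative size reduction in percent, rounded to the nearest whole
// percent, as reported in the compression tables.
function compressionPercent(originalKB, newKB) {
  return Math.round((1 - newKB / originalKB) * 100);
}

// WebP (quality = 90) on the 12383 kB PNG sample -> 62
// mozjpeg on the 10846 kB JPEG sample -> 11
```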

GIF Compression

The GIF sample size is 846 kB.

Compression method | New sample size (kB) | Average compression (%) | Total time (s)
gifsicle           | 805                  | 5                       | 16

Animated GIF Compression

Animated GIFs can be converted to WebM videos (as gfycat.com does) with a straightforward ffmpeg command. The processing time can be high, but the size reduction is huge.

Some stats from a small set of 10 animated GIFs:

Original sample size (kB) | New sample size (kB) | Average compression (%) | Total time (s) | Median time (s)
25162                     | 1177                 | 95                      | 25             | 2.46

Image size | Time (s)
531K       | 0.301
844K       | 0.719
992K       | 0.847
1.1M       | 2.726
1.3M       | 2.199
1.5M       | 2.831
1.7M       | 1.497
3.5M       | 5.223
4.5M       | 2.777
8.0M       | 6.604

Common Misconceptions

Losing radio link means losing TCP connection

The PGW (the mobile network's packet gateway) terminates the TCP/UDP connections; application-level connectivity is not tied to the physical radio link.

A device session is tied to an IP

The PGW provides NAT services: a device can be mapped to multiple IP/port combinations, or multiple devices can share the same IP.

Bandwidth is the decisive factor for page load speed

Bandwidth only sets the upper bound on transmission speed under optimal conditions. In realistic environments, given the speed of light and the shortcomings of the transmission protocol, latency is usually the limiting factor for maximum throughput.
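A back-of-the-envelope illustration of why latency dominates (the numbers are illustrative, not measurements): a TCP sender can have at most one receive window of data in flight per round trip, so throughput is bounded by window size divided by RTT, regardless of link bandwidth.

```javascript
// Upper bound on TCP throughput imposed by latency:
// at most one window of data per round trip.
function maxThroughputKBps(windowKB, rttMs) {
  return windowKB / (rttMs / 1000);
}

// With a 64 kB window and a 200 ms RTT, throughput tops out at
// 320 kB/s (~2.6 Mbit/s), even on a much faster link.
```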

Performance Analysis

Let's collect some ideas here on how to profile the prototype.

Could be useful: