I know this is an old question, but as people move toward the vh unit, this question will become much more common.

To clarify, here's an example of the problem. We have an HTML file that loads an iframe:

```html
<!DOCTYPE html>
<html>
<head>
  <style>
    iframe {
      height: 50vh;
      width: 100%;
    }
  </style>
</head>
<body>
  <iframe src="iframe.html"></iframe>
</body>
</html>
```

And its iframe, iframe.html:

```html
<!DOCTYPE html>
<html>
<head>
  <style>
    div {
      height: 50vh;
      width: 100%;
      background: blue;
    }
  </style>
</head>
<body>
  <div></div>
</body>
</html>
```

The important thing to note here is that both the iframe and the div inside it are given a height of 50vh. One might expect vh inside the iframe to resolve against the parent context's viewport height or width. Instead, the result looks like this:

That is, the height of the blue element is ~25% of the browser window, instead of the expected 50% (that is, 100% of the iframe, which is itself 50% of the window). Although we may wish the iframe to respect its parent's viewport, this example shows how unintuitive that would be, even though it would make the v* units more valuable for content meant to be iframed in. The problem comes down to how viewport height is determined.
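The compounding is easy to verify with a little arithmetic. A minimal sketch (the `resolveVh` helper and the 800px window height are illustrative assumptions, not part of the example above):

```javascript
// Resolve a vh length against a given viewport height in pixels.
// (resolveVh and the 800px figure are assumptions for illustration.)
function resolveVh(vh, viewportHeightPx) {
  return (vh / 100) * viewportHeightPx;
}

const windowHeight = 800;                          // browser viewport: 800px
const iframeHeight = resolveVh(50, windowHeight);  // iframe is 50vh of that: 400px
const divHeight = resolveVh(50, iframeHeight);     // div resolves 50vh against the
                                                   // iframe's viewport: 200px

console.log(divHeight / windowHeight);             // 0.25 → ~25% of the browser window
```

Because each nested viewport halves the basis, the two 50vh declarations multiply to 25% of the top-level window.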

From the spec:

The viewport-percentage lengths are relative to the size of the initial containing block. When the height or width of the initial containing block is changed, they are scaled accordingly.

Both an iframe and the browser window can establish the initial containing block, as both are valid viewports. A viewport is not limited to the browser window; it is defined as "a window or other viewing area on the screen through which users consult a document."

An iframe creates a nested browsing context when inserted into a document, and is thus its own viewport: vh and vw inside it resolve against the iframe's dimensions, not the top-level window's.
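You can confirm this from inside the frame itself. A quick sketch that could go in iframe.html (same-origin frames only; the logging is an assumption added for illustration, not part of the example above):

```html
<script>
  // window.innerHeight here is the iframe's own viewport height,
  // i.e. the box the parent sized to 50vh, not the browser window.
  console.log("iframe viewport:", window.innerHeight);
  // For comparison, the parent's viewport height (same-origin only):
  console.log("parent viewport:", window.parent.innerHeight);
</script>
```

With the example above, the first value should be roughly half the second, which is exactly the basis vh uses inside the frame.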