Even though the televisions in our homes are bigger and higher quality than ever before, many of us still have trouble reading subtitles or seeing fine details on the screen.

But electronics giant Samsung is aiming to stop viewers from ever having to squint at their sets again with intelligent technology that adapts if it sees them struggling.

Using eye-tracking and facial recognition software, Samsung's future TVs could detect if a viewer is screwing up their eyes or leaning towards the screen to see what is on it.

Although modern televisions are bigger and better quality than ever, many people still struggle to see clearly what is being displayed on screen. Samsung has patented technology that can detect if someone is squinting (stock picture above) or otherwise struggling to see the screen, and adjust the image automatically

It will then automatically zoom in to the area they are looking at on the display, or change the size of text in subtitles until it can be comfortably seen.

The system could also change the speed at which text is displayed on the screen or alter the brightness.

WHY SQUINTING CORRECTS VISION

When the eye looks at something, light enters the pupil and photosensitive cells within the retina create an image. This is then sent along the optic nerve to the brain.

Within the retina is a small, central 'pit' made of closely packed cones, called the fovea, which is responsible for sharp, central vision.

As people age, they can lose the ability to focus light properly, and some objects may appear blurry. The amount of light coming in can also pose a problem, as it arrives from different angles and must be focused together.

Squinting overcomes blurriness by doing two things: changing the shape of the eye and the amount of light entering it.

The technology has been revealed in a new patent application submitted by Samsung.

Writing in the patent, the company said: 'The display is reconfigured automatically, or the device requests the user to confirm whether or not the display should be reconfigured.

'In some embodiments, an accessibility menu screen in the device's user interface is shown to allow the user to manually select new display settings.

'Examples of ways in which the display is reconfigured include enlarging the size of images on the display, increasing the display brightness, and changing the font type, size, colour, or weight.'

The patent is the latest attempt by Samsung to turn its televisions into smart devices.

Its televisions already feature technology that can detect gestures and recognise voice commands.

Samsung has also included eye-tracking in its Galaxy S4 smartphone, which automatically scrolls down a page as a user reads, and pauses content if they look away.

With the help of cameras on the front of the set, it now hopes to add facial recognition and eye-tracking to its televisions.

Samsung's patent explains that the technology uses facial recognition software to watch for the size of the viewer's eyes (illustrated) to see if they are having difficulty seeing the screen, and eye-tracking to identify where on the screen they are looking

The system would be able to detect if the viewer is wearing glasses and how they are coping with the images shown on the screen.

It would also look for changes in the size of the viewer's eyes that would indicate they are having trouble seeing what is on the screen.

The technology would then automatically adjust what is on the screen to make it easier for the user to see (illustrated)

If the eyes grow larger, it suggests the screen is not bright enough, while if they grow smaller the viewer is probably squinting.

Eye-tracking would also help the system identify where on the screen the person is looking, so if they are struggling to see text at the bottom of the screen this could then be adjusted.

Samsung said that the system would also combine the eye data with information about the ambient lighting, helping it determine whether a particular part of the screen is suffering from glare.

This would mean that particular area could be boosted without affecting the rest of the image.

The system would also measure if the person has moved closer to the screen, indicating they are leaning forward to better see what is being displayed.

The system assesses whether the viewer is having difficulty seeing what is on the screen by producing what it calls a 'visual acuity score', and then adjusts the display accordingly.

If several people are watching at once, it could also produce settings that offer the best compromise for all of the viewers.

However, Samsung said the technology could also help to alert viewers to eye problems.

It said: 'Various further actions can be taken in response to a determination that the user is experiencing difficulty visually resolving the content.

'For example, the value of the facial parameter is logged and uploaded to a healthcare server to be monitored by a physician.'