No matter which smartphone brand you use, we all struggle with typing because of the small on-screen keyboard keys. Research from ETH Zürich could mean fewer typing errors on future smartphones. Researchers have developed a new AI solution that enables touchscreens to sense touch at eight times higher resolution than current devices.
Their solution can infer much more precisely where a finger lands on the touchscreen. The researchers said the challenge with typing on modern smartphones is that the touch sensors detecting where the finger is on the screen have not changed much since the mid-2000s. While the sensors have stayed largely the same, the screens themselves have improved significantly, with far higher resolution and fidelity.
While the latest iPhone has a display resolution of 2532 × 1170 pixels, its touch sensor has a far lower resolution of about 32 × 15 pixels. A capacitive touchscreen detects the position of fingers through changes in the electric field between its sensor lines, sensing the proximity of a finger as it approaches the screen surface.
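To see how large the gap is, a quick back-of-the-envelope calculation using the figures quoted above (the exact mapping of sensor cells to display pixels is an assumption for illustration) shows that each touch-sensor cell covers roughly 79 × 78 display pixels:

```python
# Rough comparison of display vs. touch-sensor resolution using the
# figures quoted in the article (values assumed, not measured).
display_px = (2532, 1170)   # display resolution
sensor_px = (32, 15)        # capacitive touch-sensor resolution (approximate)

px_per_cell = (display_px[0] / sensor_px[0], display_px[1] / sensor_px[1])
print(px_per_cell)          # about (79.1, 78.0): one sensor cell spans ~79 x 78 display pixels
```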
The team said that because capacitive sensing captures proximity rather than pressure, it cannot detect true finger contact. The method the team came up with is called CapContact and combines two approaches. First, it uses the touchscreen as an image sensor: in effect, a very low-resolution depth camera that can see about eight millimeters beyond the surface and records how close objects are. Second, CapContact exploits this insight to accurately estimate the contact area between the finger and the surface using a deep-learning algorithm built by the team.
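As a rough illustration of this kind of deep-learning upsampling, the following is a minimal sketch, not the authors' actual model: a small PyTorch network that takes a low-resolution capacitive frame and predicts an 8x super-resolved contact-area mask. The class name ContactSuperRes, the layer sizes, and the 15 × 32 input shape are illustrative assumptions.

```python
# Illustrative sketch (not the published CapContact architecture): a small
# convolutional network that upsamples a low-resolution capacitance frame by 8x
# and outputs a per-pixel contact probability.
import torch
import torch.nn as nn

class ContactSuperRes(nn.Module):
    def __init__(self, upscale: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            # PixelShuffle rearranges channels into an upscale x upscale spatial grid.
            nn.Conv2d(32, upscale * upscale, kernel_size=3, padding=1),
            nn.PixelShuffle(upscale),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, 15, 32) raw capacitance frame
        # returns: (batch, 1, 120, 256) contact-area probabilities
        return torch.sigmoid(self.features(x))

model = ContactSuperRes()
frame = torch.rand(1, 1, 15, 32)   # hypothetical low-resolution capacitive frame
contact_mask = model(frame)        # 8x super-resolved contact-area estimate
print(contact_mask.shape)          # torch.Size([1, 1, 120, 256])
```

In practice such a model would be trained against ground-truth contact areas; the sketch only shows the shape of the upsampling step.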
The team demonstrated that their system reliably distinguishes individual touches on the surface even when fingers touch the screen close together, as in a pinch gesture. The researchers believe the AI solution can pave the way for new touch sensing in future phones and tablets, allowing them to operate more reliably and precisely while reducing the number of sensor traces and the complexity of capacitive sensor manufacturing.
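To show why a higher-resolution contact mask helps separate nearby touches, here is a minimal post-processing sketch, assuming a mask like the one above; it is not taken from the paper. It labels connected contact regions and extracts one centroid per touch, so two fingers in a pinch remain distinct blobs:

```python
# Illustrative post-processing (assumed, not from the paper): separate individual
# touches in a super-resolved contact mask and report their centroids.
import numpy as np
from scipy import ndimage

def extract_touches(contact_mask: np.ndarray, threshold: float = 0.5):
    """Return one (row, col) centroid per connected contact region."""
    binary = contact_mask > threshold
    labels, num_touches = ndimage.label(binary)   # connected-component labeling
    return ndimage.center_of_mass(binary, labels, range(1, num_touches + 1))

# Hypothetical 120 x 256 mask with two contact blobs only a few pixels apart,
# as would occur during a pinch gesture.
mask = np.zeros((120, 256))
mask[40:50, 100:110] = 1.0
mask[40:50, 114:124] = 1.0
print(extract_touches(mask))   # two distinct centroids despite the small gap
```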