How Your Smart Touch Device Really Works

Jason Joseph
6 min read · Apr 29, 2020

Almost everyone now owns some sort of smartphone, whether it's an Android or an iPhone. They all include a touch screen whose touch capabilities let us gesture our way to wherever we want to go on our devices. Over the years, touch screens themselves have become more advanced, and there are now multiple ways to interact with them. One example is 3D Touch (used especially by Apple), which adds extra interactive levels to the standard glass screen. The technology recognizes force as well as gestures in order to offer more accurate haptic feedback, which in turn makes apps more accessible: variations in pressure can trigger previews, quick actions and more.

Touch screens are programmed in the usual way, with standard languages like C++, .NET and so on. One common design uses two layers that are held slightly apart; touching the screen pushes them together. Another lines the edges of the screen with reflectors that spread signals across the surface, and a touch disturbs those signals at a specific spot, giving the controller an X, Y coordinate. But how does all of this really work?

Electronic devices can use many different methods to detect a person's input on a touch screen. Most of them use sensors and circuitry to monitor changes in a particular state. Many, including the iPhone, monitor changes in electrical current. Others monitor changes in the reflection of waves, which can be sound waves or beams of near-infrared light. A few systems use transducers to measure the vibration caused when your finger hits the screen's surface, or cameras to monitor changes in light and shadow. The basic idea is pretty simple: when you place your finger or a stylus on the screen, it changes the state the device is monitoring. In screens that rely on sound or light waves, your finger physically blocks or reflects some of the waves. Capacitive touch screens use a layer of capacitive material to hold an electrical charge; touching the screen changes the amount of charge at a specific point of contact. In pressure-sensitive designs, capacitive sensors are integrated into the display's backlight; microscopic changes in the distance between the cover glass and the backlight are measured and, combined with the touch sensor and accelerometer, provide an accurate pressure reading. In resistive screens, the pressure from your finger causes conductive and resistive layers of circuitry to touch each other, changing the circuits' resistance.
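To make the resistive case a little more concrete, here's a minimal sketch in C++ of how that change could be turned into a coordinate. None of this comes from a real controller: the ADC resolution, screen size, raw readings and the rawToScreen helper are all assumptions made for illustration. The point is simply that the contact spot acts like a voltage divider, so the raw reading scales roughly linearly with position.

```cpp
#include <cstdint>
#include <iostream>

// Hypothetical constants: a 12-bit ADC and a 320 x 480 pixel screen.
constexpr uint32_t kAdcMax  = 4095;
constexpr uint32_t kScreenW = 320;
constexpr uint32_t kScreenH = 480;

struct TouchPoint { uint32_t x; uint32_t y; };

// Pressing a resistive screen brings its two layers into contact; the contact
// point acts as a voltage divider along each axis, so the raw reading is
// roughly proportional to position.
TouchPoint rawToScreen(uint32_t rawX, uint32_t rawY) {
    return { rawX * kScreenW / kAdcMax, rawY * kScreenH / kAdcMax };
}

int main() {
    // Simulated raw readings standing in for real ADC samples.
    TouchPoint p = rawToScreen(2100, 1900);
    std::cout << "touch at (" << p.x << ", " << p.y << ")\n";
}
```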

Most of the time, these systems are good at detecting the location of exactly one touch. If you try to touch the screen in several places at once, the results can be erratic. Some screens simply disregard all touches after the first one. Others can detect simultaneous touches, but their software can't calculate the location of each one accurately. There are several reasons for this. Many systems detect changes along an axis or in a specific direction instead of at each point on the screen. Some screens rely on system-wide averages to determine touch locations. And some systems take measurements only after first establishing a baseline; when you touch the screen, you create a new baseline, so adding another touch causes the system to take a measurement using the wrong baseline as a starting point.
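The baseline problem is easier to see with numbers. The toy sketch below (every value is invented) shows what happens when a controller re-baselines while the first finger is still on the screen: the second finger gets measured against the wrong starting point, so the system badly underestimates the change.

```cpp
#include <iostream>

// Toy numbers: the idle sensor level and the extra signal one finger adds.
int main() {
    const int idleReading  = 100;
    const int fingerSignal = 40;

    // Correct behaviour: the baseline is captured while nothing touches the screen.
    int baseline   = idleReading;
    int firstTouch = idleReading + fingerSignal;
    std::cout << "first finger delta:  " << (firstTouch - baseline) << "\n";   // 40 -> detected

    // Faulty behaviour: the system re-baselines while the first finger is still
    // down, so that finger's signal gets folded into the "no touch" reference.
    baseline = firstTouch;
    int secondTouch = idleReading + 2 * fingerSignal;
    std::cout << "second finger delta: " << (secondTouch - baseline) << "\n";  // 40, not 80
    // The controller only sees one finger's worth of change and can't tell
    // that two fingers are now on the screen.
}
```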

To allow people to use touch commands that require multiple fingers, smart devices use a different arrangement of existing technology. The touch-sensitive screen includes a layer of capacitive material, just like many other touch screens, but here the capacitors are arranged according to a coordinate system. The circuitry can sense changes at each point along that grid. In other words, every point on the grid generates its own signal when touched and relays that signal to the processor, which lets the phone determine the location and movement of simultaneous touches in multiple locations. Because of its reliance on this capacitive material, the screen only responds to a fingertip; it won't work with a standard stylus or through non-conductive gloves. These screens detect touch through one of two methods: mutual capacitance or self capacitance. In mutual capacitance, the capacitive circuitry requires two distinct layers of material: one houses driving lines, which carry current, and the other houses sensing lines, which detect the current at nodes. Self capacitance uses one layer of individual electrodes connected to capacitance-sensing circuitry. Both setups send touch data to the processor as electrical impulses.
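Here's a rough sketch of what scanning such a grid might look like. The grid size, readings, threshold and the findTouches helper are all invented for illustration, but the sketch shows why a per-node grid can report two simultaneous fingers where a single-axis design could not: every node is compared against its own baseline independently.

```cpp
#include <array>
#include <iostream>
#include <vector>

// Invented grid dimensions and threshold for a small mutual-capacitance panel.
constexpr int kRows = 4;        // driving lines
constexpr int kCols = 4;        // sensing lines
constexpr int kThreshold = 20;  // minimum signal drop that counts as a touch

using Grid = std::array<std::array<int, kCols>, kRows>;
struct Node { int row; int col; };

// A finger near a node steals charge from the row/column coupling, so the
// reading at that node drops below its baseline. Because every node is
// checked independently, several simultaneous touches can be located.
std::vector<Node> findTouches(const Grid& baseline, const Grid& reading) {
    std::vector<Node> touches;
    for (int r = 0; r < kRows; ++r)
        for (int c = 0; c < kCols; ++c)
            if (baseline[r][c] - reading[r][c] > kThreshold)
                touches.push_back({r, c});
    return touches;
}

int main() {
    Grid baseline;
    for (auto& row : baseline) row.fill(100);  // calibrated with no fingers present
    Grid reading = baseline;

    // Simulate two fingers down at once.
    reading[1][1] = 60;
    reading[3][2] = 55;

    for (const Node& n : findTouches(baseline, reading))
        std::cout << "touch at row " << n.row << ", col " << n.col << "\n";
}
```

A real controller would typically also interpolate between neighbouring nodes for finer precision, but the basic scanning idea is the same.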

Your smart device's processor and software are central to correctly interpreting input from the touch screen. The capacitive material sends raw touch-location data to the device's processor, which uses software stored in the phone's memory to interpret the raw data as commands and gestures. Here's what happens:

1.) Signals travel from the touch screen to the processor as electrical impulses.

2.) The processor uses software to analyze the data and determine the features of each touch, including the size, shape and location of the affected area on the screen. If necessary, the processor arranges touches with similar features into groups. If you move your finger, the processor calculates the difference between the starting point and ending point of your touch.

3.) The processor uses its gesture-interpretation software to determine which gesture you made. It combines your physical movement with information about which application you were using and what the application was doing when you touched the screen (a simplified sketch of this step follows the list).

4.) The processor relays your instructions to the program in use. If necessary, it also sends commands to the device's screen and other hardware. If the raw data doesn't match any applicable gestures or commands, the device disregards it as an extraneous touch.
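Here is the simplified sketch of the gesture-interpretation step mentioned above. A real recognizer tracks many touches over time and factors in what the current app is doing; the Touch structure and the thresholds below are purely illustrative assumptions, not any platform's actual API.

```cpp
#include <cmath>
#include <iostream>
#include <string>

// Illustrative only: start/end positions and duration of a single touch.
struct Touch {
    float startX, startY;  // where the finger went down
    float endX, endY;      // where the finger lifted
    float durationMs;      // how long it stayed down
};

std::string classifyGesture(const Touch& t) {
    float dx = t.endX - t.startX;
    float dy = t.endY - t.startY;
    float distance = std::sqrt(dx * dx + dy * dy);

    if (distance < 10.0f)  // the finger barely moved
        return t.durationMs < 300.0f ? "tap" : "long press";

    // Moved a meaningful distance: call it a swipe in the dominant direction.
    if (std::fabs(dx) > std::fabs(dy))
        return dx > 0 ? "swipe right" : "swipe left";
    return dy > 0 ? "swipe down" : "swipe up";
}

int main() {
    Touch quickTap  {100, 200, 102, 201, 120};
    Touch dragRight {100, 200, 260, 210, 250};
    std::cout << classifyGesture(quickTap)  << "\n";  // prints "tap"
    std::cout << classifyGesture(dragRight) << "\n";  // prints "swipe right"
}
```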

All of these steps happen within a fraction of a second, so you see changes on the screen based on your input almost instantly. This process lets you access and use all of the device's applications with your fingers.

As you can see, there is a lot going on behind the scenes when you make even a simple touch on your smart device. It's exciting to imagine how much further future developers could take our experience with touch screens. Even today, the technology inside these devices is extremely advanced. Right now we use our physical fingers, but who knows: one day we may all be able to use our minds to merge with AI and interact in even more intuitive ways!
