Technical question about touchscreens.

leapinlew:

My company develops biometric software which uses many metrics from the iPhone/iPod touch/iPad touchscreen to assess the identity of the user.

My task has been to try to find out if the touchscreen on these devices is sensitive enough to get any kind of fingerprint. Our software can accurately identify a fingerprint with only a partial print at 160 dpi. After much searching and discussion, I am still at a loss. I have a few questions and was hopeful someone here could help answer them.

1. How does the screen sense a touch? Does it detect the capacitance difference between the valleys and ridges of a fingerprint?

2. When I was reviewing capacitive fingerprint scanners, they appeared to use the same technology. After a few conversations with touchscreen manufacturers, it seems the technology is very similar, but a phone touchscreen has far fewer electrodes and uses interpolation to determine where on the screen is being pressed.
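To make the interpolation point concrete, here is a toy sketch (hypothetical capacitance values and a 5 mm electrode pitch are assumptions, not specs): a controller with a coarse electrode grid can estimate the finger's position to sub-electrode precision with a weighted centroid, but that interpolation only recovers the bulk position of the finger, not ridge-scale structure.

```python
# Toy sketch: estimating touch position as a weighted centroid over a coarse
# electrode grid. Values and pitch are hypothetical; real controllers do this
# in firmware. Interpolation gives sub-electrode *position* accuracy, but the
# underlying samples are still millimetres apart -- no ridge/valley detail.

def centroid(readings, pitch_mm):
    """Weighted average position (mm) of capacitance deltas along one axis."""
    total = sum(readings)
    return sum(i * pitch_mm * r for i, r in enumerate(readings)) / total

# Hypothetical capacitance deltas on five column electrodes, 5 mm apart.
cols = [0, 12, 40, 18, 0]
x_mm = centroid(cols, pitch_mm=5.0)
print(round(x_mm, 2))  # estimated finger position along the axis, in mm
```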
 
I know these are broad questions, but I'm having a difficult time even trying to figure out the right questions to ask. The basic question I have is:

Does an iDevice have enough sensitivity in its screen to pull a rough fingerprint image? If not, why not? If anyone could help me answer this or point me in the right direction, I would greatly appreciate it. I've been searching for a while and haven't been able to locate a satisfactory answer.
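The resolution gap can be put in rough numbers (the electrode pitch below is an assumed, representative value, not a measured spec for any particular panel): matching at 160 dpi implies sampling roughly every 0.16 mm, while a typical mutual-capacitance touchscreen's electrodes sit millimetres apart.

```python
# Back-of-envelope comparison: resolution needed for fingerprint matching
# vs. the effective resolution of a typical touchscreen electrode grid.
# The 5 mm electrode pitch is an assumption for illustration.

MM_PER_INCH = 25.4

required_dpi = 160
sample_spacing_mm = MM_PER_INCH / required_dpi      # ~0.16 mm between samples
electrode_pitch_mm = 5.0                            # assumed electrode spacing
effective_dpi = MM_PER_INCH / electrode_pitch_mm    # ~5 dpi

print(round(sample_spacing_mm, 3), round(effective_dpi, 2))
```

Even with interpolation, the grid under-samples a fingerprint's ridge spacing by well over an order of magnitude on these assumptions.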

Mysterioii:

If it has fewer sensors and uses interpolation to make an educated guess as to where you're touching, that would explain why my droid razr maxx seems to do a crappy job of identifying which link I'm trying to click or which key I'm trying to press on the on-screen keyboard.   :badmood:  It could just be that the software isn't making the most of the hardware's capabilities, but based on my user experience I'd be shocked if this thing would have the capability to do what you're looking for.

kahlid74:

My understanding is that the technology used does not allow information like that to be gathered, because it is only concerned with position, not "what" is touching it. My guess is you could hack one to do what you're looking for, but iOS may not have the hooks to do such a thing natively.

drventure:

First, I don't know ANYTHING about this particular area, so take my comments with a grain of salt.

Everything I've found tends to imply that the conversion of "touch" to x-y happens in hardware, well before you can get your hands on the raw data.

But I did find this link that talks about touchscreen kernel level filters being able to distinguish between finger, thumb, thumbnail and stylus.

http://wiki.openmoko.org/wiki/Touchscreen_Filters

Might be a place to start.

I googled "raw touchscreen data"
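Along the same lines: even the "raw" data exposed by kernel input drivers is typically already reduced to coordinates. A sketch decoding Linux evdev events from a hypothetical captured buffer (event codes and values here are illustrative) shows that what arrives is X/Y positions, not per-electrode capacitance:

```python
# Sketch: decoding raw Linux evdev touch events (struct input_event) from a
# captured byte buffer. Even at this "raw" level, the driver reports
# interpolated X/Y coordinates -- no per-electrode capacitance data.
import struct

# struct input_event on a typical 64-bit Linux: timeval (2 longs), then
# type, code, value.
EVENT_FMT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FMT)

EV_ABS = 0x03          # absolute-axis event type
ABS_X, ABS_Y = 0x00, 0x01

def decode_events(buf):
    """Yield (type, code, value) tuples from a raw evdev byte buffer."""
    for off in range(0, len(buf) - EVENT_SIZE + 1, EVENT_SIZE):
        _sec, _usec, etype, code, value = struct.unpack_from(EVENT_FMT, buf, off)
        yield etype, code, value

# Hypothetical captured buffer: one ABS_X and one ABS_Y report.
buf = struct.pack(EVENT_FMT, 0, 0, EV_ABS, ABS_X, 512)
buf += struct.pack(EVENT_FMT, 0, 0, EV_ABS, ABS_Y, 300)

coords = {code: value
          for etype, code, value in decode_events(buf) if etype == EV_ABS}
print(coords[ABS_X], coords[ABS_Y])  # already x/y positions, no ridge data
```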

knave:

I also have no education on how the data is handled, but from replacing smartphone screens I know that the digitizer sticks to the back of the touchscreen glass and connects to the phone hardware with a ribbon cable. Whether it just reads x, y coordinates or is capable of sensing texture is the real question. My theory is that it detects when our touch interrupts the flow of current through the digitizer and then plots the coordinates from that. This would suggest that what you want to do is not possible. How do common biometric fingerprint scanners read the fingerprints? Light? Hmm...

A dongle might be an option...

...or find a way to use the camera...
