Friday, June 23, 2006
Multipoint touchscreen!!
Looks like old news, but it didn't catch my eye until a couple of days back. Steve Jobs and his Apple continue to inspire me!!
Apple files for new touch screen, media file patents
Apple Computer filings published on Thursday show the iPod maker to be working on multipoint touch screens.
A filing with the United States Patent and Trademark Office made on May 6, 2004 and published for the first time on Thursday describes a "multipoint touchscreen" that relates to "a touch screen capable of sensing multiple points at the same time."
According to the filing, the touch screen comprises a pixelated array of transparent capacitance-sensing nodes and would appear as a transparent panel positioned in front of the display.
"Unlike conventional touch screens, however, the touch screen shown herein is configured to recognize multiple touch events that occur at different locations on the touch sensitive surface of the touch screen at the same time," the filing reads. "That is, the touch screen allows for multiple contact points to be tracked simultaneously, i.e., if four objects are touching the touch screen, then the touch screen tracks all four objects."
"The multiple objects may for example correspond to fingers and palms," the filing continues. "Because the touch screen is capable of tracking multiple objects, a user may perform several touch initiated tasks at the same time. For example, the user may select an onscreen button with one finger, while moving a cursor with another finger. In addition, a user may move a scroll bar with one finger while selecting an item from a menu with another finger. Furthermore, a first object may be dragged with one finger while a second object may be dragged with another finger. Moreover, gesturing may be performed with more than one finger." |
posted by kart @ 3:27 AM

8 Comments:
-
I don't know da... somehow I get an instinct that this would pave the way for another new tech revolution... 'Simulation of touch'!!
Hmmm... the day isn't far when we guys touch Laetitia Casta and Salma Hayek on the 'touch screen'?! Am I not a weird guy?!
-
@Prad - As we know, conventional GUI-based apps run on an event-based mechanism to detect and handle mouse events. With multi-touch, the challenge is handling concurrency. But I guess the existing pseudo-parallelism is sufficient to handle the concurrent events. No big deal, as long as the OS that monitors the mouse braces itself to generate concurrent events - which is again no big deal, for the same reason. We already handle concurrency in some form, like a keyboard press and a mouse click together, taking advantage of the fact that even if the events are really concurrent, the delay in handling goes unnoticed by the user.
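A minimal sketch of that pseudo-parallelism argument: concurrent sources push into one queue, and a single loop drains it serially, fast enough that nobody notices. All names here are illustrative, not any real OS interface:

```cpp
// Even if two input events are physically simultaneous, the OS can push
// both into one queue and a single dispatch loop can drain them one at a
// time; the per-event delay is too small for the user to notice.
#include <functional>
#include <mutex>
#include <queue>

class EventQueue {
    std::queue<std::function<void()>> q_;
    std::mutex m_;
public:
    // Called from any source (touch controller, keyboard, mouse) -- may race.
    void post(std::function<void()> handler) {
        std::lock_guard<std::mutex> lock(m_);
        q_.push(std::move(handler));
    }
    // Called from the single UI thread: events are handled serially.
    bool dispatchOne() {
        std::function<void()> h;
        {
            std::lock_guard<std::mutex> lock(m_);
            if (q_.empty()) return false;
            h = std::move(q_.front());
            q_.pop();
        }
        h();
        return true;
    }
};
```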
-
@Prad: Adding to what Sunil said...
Technically speaking, the Windows event-handling mechanism is very simple. 1) The OS runs a loop that watches for input events and posts them as messages to the various applications' queues.
2) Applications in turn have their own message loops, which get each message, translate it, and dispatch it to the appropriate window.
3) These queues are already capable of holding concurrent messages.
Now, you need to perform a specific mapping for the multi-touch events where necessary, just like how Ctrl key presses are handled by Windows (that's exactly what TranslateMessage is for).
No big deal in Windows!! Not sure about the Linux or Mac event-handling mechanisms!!
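For reference, the loop being described looks like this. GetMessage, TranslateMessage, and DispatchMessage are the real Win32 calls; the wrapper function is just a sketch:

```cpp
// The standard Win32 message loop. GetMessage pulls the next message off
// this thread's queue, TranslateMessage turns key-down messages into
// character messages (Ctrl+C arrives as one control character, for
// example), and DispatchMessage hands the message to the target window's
// WndProc.
#include <windows.h>

int runMessageLoop() {
    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return static_cast<int>(msg.wParam);
}
```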
-
@Karthik - In my understanding, concurrent messages are handled with a multithreaded architecture. But in this case, the same thread, working on the same object (an image, for example) and doing the same kind of operation (say, resize), needs to handle concurrent messages. Pradeep pointed out that unless the application has been modelled to handle concurrency, you may face problems. As opposed to the concurrent message queues you mentioned, we could have a single queue as in existing applications, but then the OS has to take care of the concurrency. For example, if you are resizing an image with four fingers, it should generate drag events at four different points, and the key is that it should handle all these events round-robin; it shouldn't give more priority to, say, the dragging of the top-left corner. Given that the number of input events - the number of points you might touch - is unbounded, the OS has a problem.
I believe we can't have any sane discussion without knowing the kind of interface the multi-touch screen provides. It may serialize the parallel inputs, making life easier for OSs, which also rules out using conventional mouse drivers. Nevertheless, well pointed out, Pradeep!
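A sketch of that round-robin idea, assuming a hypothetical per-touch event feed; no real multi-touch API is implied:

```cpp
// Four fingers produce four streams of drag events; the dispatcher
// services one event from each non-empty stream in rotation, so no
// corner of the image is starved. Interface invented for illustration.
#include <cstddef>
#include <deque>
#include <vector>

struct DragEvent { std::size_t touchId; float x, y; };

class RoundRobinDispatcher {
    std::vector<std::deque<DragEvent>> perTouch_; // one queue per finger
    std::size_t next_ = 0;                        // next queue to service
public:
    explicit RoundRobinDispatcher(std::size_t touches) : perTouch_(touches) {}

    void post(const DragEvent& e) {
        if (e.touchId < perTouch_.size()) perTouch_[e.touchId].push_back(e);
    }

    // Service the queues in strict rotation; returns false when all are empty.
    bool dispatchOne() {
        for (std::size_t i = 0; i < perTouch_.size(); ++i) {
            auto& q = perTouch_[next_];
            next_ = (next_ + 1) % perTouch_.size();
            if (!q.empty()) {
                DragEvent e = q.front();
                q.pop_front();
                handleDrag(e); // e.g. move the grabbed corner to (e.x, e.y)
                return true;
            }
        }
        return false;
    }
private:
    void handleDrag(const DragEvent&) { /* application-specific */ }
};
```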
-
Moms... what happens when you press multiple keystrokes and then simultaneously use the mouse?!
1) Each keystroke is an event. 2) When you move the mouse pointer by an inch, hundreds of mouse_move events are fired.
How are these handled? Isn't the application able to handle it? Do you notice any delay?
Though they happen in parallel, you could process them serially and still achieve concurrency?! Ain't this analogous to the way the OS uses a single processor and simulates multiple processes running in parallel (the only difference being that the message loop is just a round-robin, non-pre-emptive scheduling algo, unlike the OS algos)?
PS: I agree that the problem will occur only when an event handler takes the lion's share of CPU time!! But that's a valid scenario even in the normal case.
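A toy illustration of the PS: in a FIFO, non-pre-emptive loop, one slow handler delays everything queued behind it:

```cpp
// Pretend three events were queued and are drained strictly in order;
// the slow middle handler makes the last one ~200 ms late.
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    auto t0 = std::chrono::steady_clock::now();
    auto handleFast = [] { std::puts("fast event handled"); };
    auto handleSlow = [] { std::this_thread::sleep_for(std::chrono::milliseconds(200)); };
    handleFast();
    handleSlow();   // hogs the loop...
    handleFast();   // ...so this one runs ~200 ms late
    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                  std::chrono::steady_clock::now() - t0).count();
    std::printf("all three handled after %lld ms\n", static_cast<long long>(ms));
    return 0;
}
```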
-
Oops... the message loop is just a FIFO, non-pre-emptive scheduling algo. Mentioned it wrongly.
I can think of another case where a normal application would fail. For example: at the moment, touch screen events are handled as mouse events. But the mouse_up event of one point/touch should not be paired with the mouse_down of some other point/touch. So the application has to be re-designed to handle this?!
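A sketch of that pairing fix, assuming a hypothetical touchId field on each event; plain mouse events carry no such field, which is exactly the problem:

```cpp
// Tag every touch event with an id so that a finger's "up" is matched
// against that same finger's "down". The touchId field is hypothetical.
#include <map>

struct PointerEvent {
    enum Kind { Down, Up } kind;
    int   touchId; // hypothetical: identifies which finger this is
    float x, y;
};

class GestureTracker {
    std::map<int, PointerEvent> active_; // downs awaiting their matching up
public:
    void onEvent(const PointerEvent& e) {
        if (e.kind == PointerEvent::Down) {
            active_[e.touchId] = e;            // remember where this finger landed
        } else {
            auto it = active_.find(e.touchId); // pair up with this finger's own down
            if (it != active_.end()) {
                // a complete press/drag from it->second to e; handle it, then:
                active_.erase(it);
            }
        }
    }
};
```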
-
I also initially tried to perceive this as analogous to multiprocessing. But the difference is, there is a lot of sophistication, like time slicing and scheduling, built into the design of a multiprocessed system, and furthermore, the processor itself was designed to support it. That level of sophistication can't be added to a device driver, since it has very few, standard ways of interfacing, such as interrupt mechanisms or memory mapping. This forces us to think about how the device will communicate multiple events to the processor, never mind how the OS handles them. If the device serializes, the OS can simply handle it as it does other serial devices. (This is the intuitive design, which would also be the fastest.) The mistake in our analysis was that we tried to map a multi-touch monitor onto a mouse. But these are two different paradigms of graphical user interfacing! For example, how many pointers do you plan to have on your screen? Infinitely many? We look at drag and click events and start discussing arbitration. Why can't the language that a multi-touch device speaks be something other than coordinates and clicks? What if there were a configurable limit on the number of points it handles concurrently? The answer to all these: we don't know the design or the interface.
FYI: Keyboard - there is no concurrency. The keyboard controller itself checks the status of Alt, Shift, Ctrl and the other keys and sends a scan code; you don't get a separate notification for Ctrl and C, for example. In the case of multiple keystrokes at the same time, I don't find any concurrency issue: the keyboard gives you the first key's keydown, and the other key presses are simply ignored. Mouse - well, there may be countless events happening when you just move the mouse pointer, but they happen serially; I can see no concurrency involved. Mouse and keyboard at the same time: you have separate event handlers for each, and they run in parallel, or rather pseudo-parallel. So the concurrency comes down to multiprocessing in this case. But multi-touch throws up numerous questions. There are multiple events occurring concurrently in the same device! I couldn't think of any device analogous to this!
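A sketch of the "device serializes" idea, with a wire format invented purely for illustration (the real interface is unknown, which is the point above):

```cpp
// The panel scans all contacts and emits them one after another as a
// flat byte stream, so the host can read it like any other serial
// device. Frame layout: [0xA5][count][id x_lo x_hi y_lo y_hi]...
// This format is hypothetical.
#include <cstdint>
#include <vector>

struct Contact { uint8_t id; uint16_t x, y; };

std::vector<uint8_t> serializeFrame(const std::vector<Contact>& contacts) {
    std::vector<uint8_t> out;
    out.push_back(0xA5);                                   // frame marker
    out.push_back(static_cast<uint8_t>(contacts.size()));  // number of contacts
    for (const Contact& c : contacts) {
        out.push_back(c.id);
        out.push_back(static_cast<uint8_t>(c.x & 0xFF));
        out.push_back(static_cast<uint8_t>(c.x >> 8));
        out.push_back(static_cast<uint8_t>(c.y & 0xFF));
        out.push_back(static_cast<uint8_t>(c.y >> 8));
    }
    return out;
}
```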
-
Missed the discussion as I didn't get time to sit at my desk today. Was running around getting some fucking 10-year-old hardware into working condition, then connecting it to brand new hardware and making the two speak to each other :( Sunil's got it perfectly correct. The present device drivers serialise interrupts, and the OSs are not expected to work with parallel events - purely in terms of GUI. There are already situations in which a device has to handle simultaneous interrupts - a good example is IO handling. For all the IO sent to a device, the bus device in the host will receive multiple interrupts on IO completion. But since these need not all be real-time operations (meaning they can wait), the h/w uses queues to put these interrupts on hold. Really speaking, there are queues (h/w and s/w) involved at many levels, which introduce a lot of lag into the life of an IO. But it all goes unnoticed because the processors (the CPU and the IO processor) are faster than the storage devices. Okie - guess I got carried away with my own stuff :)
The point I am trying to make is that this queueing stuff (which was the first thing that hit my mind) won't work with GUI-based h/w, because the change has to be immediate. It is also obvious that we cannot have infinitely many such sense points - there has to be a limit. From where I see it, this limit will be h/w enforced - we will only have as many touch points as the device can handle (as many interrupt sources as the device can afford). Also, there can be all kinds of fundoo stuff done, like a special protocol with which the GUI device creates packets, or a stream of packets, describing coordinate movements, and simply streams these packets to the proc, which does the necessary processing. This seems pretty much the way to go, because the kinds of events we have today are very limited, so it will be simple to set up such a protocol and gradually expand it with new kinds of events later on. It is always possible to bump up that GUI hardware and make it a bit more powerful, to do all this as fast as real time. And yeah, we are living in the days of GPUs, where the CPU isn't really involved in all this shit :)
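And the host-side complement to the serialization sketch earlier in the thread: parsing that same invented packet format back into per-contact events. Again purely hypothetical:

```cpp
// Parse [0xA5][count][id x_lo x_hi y_lo y_hi]... back into contacts.
// Returns nullopt on a malformed frame.
#include <cstdint>
#include <optional>
#include <vector>

struct Contact { uint8_t id; uint16_t x, y; };

std::optional<std::vector<Contact>> parseFrame(const std::vector<uint8_t>& buf) {
    if (buf.size() < 2 || buf[0] != 0xA5) return std::nullopt;
    std::size_t count = buf[1];
    if (buf.size() != 2 + count * 5) return std::nullopt; // 5 bytes per contact
    std::vector<Contact> contacts;
    for (std::size_t i = 0; i < count; ++i) {
        const uint8_t* p = &buf[2 + i * 5];
        contacts.push_back({p[0],
                            static_cast<uint16_t>(p[1] | (p[2] << 8)),
                            static_cast<uint16_t>(p[3] | (p[4] << 8))});
    }
    return contacts;
}
```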
This is all the h/w engineers' problem, and I guess it will take some time to settle, during which various people will come together, set up working groups or task forces, and release V1.0 and V2.0 of the spec :) But the really interesting aspect of this whole thing is what Pradeep has pointed to. This introduces a whole new paradigm shift in the mind of the application, or rather the GUI s/w, developer. Till today, we have had a keyboard to type and a three-button device to click. Multiple entry points - it opens a big, huge window for creativity to flow :)