
Tremor compensation, revisited

Discussion in 'Embedded' started by D Yuniskis, Apr 23, 2011.

  1. D Yuniskis

    D Yuniskis Guest

    Hi,

    I've been poring over early data regarding my
    initial attempts at "suppressing" the effects of
    tremor (Parkinsonian, ET, etc.) in the application
    of pointing devices (N.B. this is wrt their use
    *as* pointing devices).

    Of course, The Cynic expected mixed results -- and
    was not disappointed! :<

    It seems that the frequency and magnitude vary with the
    type of pointing device. I suspect a consequence of
    how much "mechanical damping" is introduced in the
    use of the device itself (more == less effect).

    Effects seem to be more significant "East-West" than
    "North-South". OTOH, N-S seems to exhibit signs of
    "stiction" (for want of a better word... maybe an
    increased "dead-band", of sorts?).

    Unfortunately, the frequency domain of the tremors seems
    too close to that of the actual "pointer motion" (that
    wants to be preserved). It seems like this gap should
    increase with age as general mobility decreases (?).

    Individual preference and pointing device characteristics
    seem to affect the actual orientation of these axes, though.
    So, filtering must be capable of rotating the native data
    coordinate system to align itself with the physical motions
    of the user.

    This is relatively easy to do in real-time -- since the
    orientation remains constant, once identified.
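
    (For concreteness, a rough sketch of the sort of rotation I mean --
    Python; "theta" stands for the hypothetical axis angle that would
    come out of the calibration step, not anything I've measured yet:)

    import math

    def rotate_deltas(dx, dy, theta):
        """Map raw device deltas into the user's motion axes.

        theta is the (assumed constant) angle, in radians, between the
        device's native X axis and the user's dominant motion axis."""
        c, s = math.cos(theta), math.sin(theta)
        u = c * dx + s * dy        # along the user's "E-W" axis
        v = -s * dx + c * dy       # along the user's "N-S" axis
        return u, v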

    This brings me to the point of my post:

    Since I can't "automagically" observe and remove signs of
    tremor in the actual data stream, I need a means by which
    I can "calibrate the user". In a simplistic sense, maybe
    something like: "move pointing device from point A to
    point B" and have the tool analyze the actual trajectory
    followed. From that, deduce appropriate time constants
    for the filters (??)

    I suspect I can just treat "from" and "to" points on the
    trajectory as the theoretical "perfect" path and watch
    the oscillations/deviations around this?
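
    (Something like the following is what I'm picturing -- just a
    Python/numpy sketch; "samples" stands for the recorded (x, y) track
    from the "move from A to B" exercise, which is an assumption on my
    part:)

    import numpy as np

    def deviation_from_chord(samples):
        """Signed perpendicular deviation of each sample from the
        straight "from"->"to" chord (the theoretical perfect path)."""
        p = np.asarray(samples, dtype=float)      # shape (N, 2)
        a, b = p[0], p[-1]                        # "from" and "to" points
        chord = (b - a) / np.linalg.norm(b - a)   # unit vector along ideal path
        rel = p - a
        # 2D cross product against the chord gives the signed distance.
        return rel[:, 0] * chord[1] - rel[:, 1] * chord[0]

    def tremor_magnitude(dev):
        """Crude figure of merit: RMS of the deviations."""
        dev = np.asarray(dev, dtype=float)
        dev = dev - dev.mean()
        return float(np.sqrt(np.mean(dev ** 2)))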

    Thx,
    --don
     
    D Yuniskis, Apr 23, 2011
    #1



  2. This reminds me somewhat of touchscreen calibration routines.

    Hmmmm.... I recall an individual with Parkinson's who could only walk
    with a homemade device located 'under his nose' (so to speak) such
    that some moving black and white blocks on a motorized band were in
    the lower part of his view. Of course, this is for the 'freezing'
    symptom. Those folks seem to need visual cues. You can find Youtube
    videos related to that.

    Also, many folks have the 'pill rolling'/'watch winding' tremor
    symptom (around 4-6 'beats' per second). Maybe if the effect of that
    tremor on mouse, trackball and touchpad inputs could be quantified,
    some smoothing or thresholds could be modified for specific directions
    of movement. (I'd expect trackballs to be particularly poor for
    Parkinson's tremor victims.) That is, some elliptical or other shaped
    sensitivity, at some angle, around the cursor.
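
    (Off the top of my head, maybe something like this for the elliptical
    sensitivity -- a Python sketch; the angle and the two gains are
    placeholders somebody would have to measure per user:)

    import math

    def elliptical_gain(dx, dy, angle, gain_major, gain_minor):
        """Scale a pointer delta differently along the two axes of an
        ellipse tilted by 'angle' (radians) -- e.g. damp the axis the
        tremor rides on more heavily than the one it doesn't."""
        c, s = math.cos(angle), math.sin(angle)
        along = c * dx + s * dy        # component along the major axis
        across = -s * dx + c * dy      # component along the minor axis
        along *= gain_major
        across *= gain_minor
        # Rotate back into device coordinates.
        return c * along - s * across, s * along + c * across
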
    And, getting back to the visual cues, perhaps an animated cursor would
    help. One with moving stripes?

    Interesting work.

    Somewhat OT but, have you read any Oliver Sacks books? Awakenings is
    very good and relates to Parkinson's. I also like Island of the
    Colorblind.
     
    1 Lucky Texan, Apr 24, 2011
    #2

  3. D Yuniskis

    D Yuniskis Guest

    <frown> Different animal. With a touchscreen you are mapping a
    two-dimensional grid onto a two-dimensional grid :> Pretty much
    a static undertaking.

    Here, I'm trying to watch a dynamic behavior and use those
    observations to tune filters to try to null out the "undesirable"
    aspects.
    I have some filters that allow me to attenuate (to some extent)
    these extra motions. Of course, they also impact *desired*
    motions... "dem's da brakes". Note that ET sufferers have slightly
    different needs than Parkinsonian tremor -- though they can obviously
    both benefit from similar technologies.

    What I want to do is look at a 2D "track" and, from that, identify
    and characterize the "extra motions". A naive approach is to just
    fit a straight line to the track (begin, end), rotate this to the
    intended orientation (N-S or E-W) and then look at the deviations
    from that ideal as symptoms of this extra motion.
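
    (As a sketch of how I'd quantify those deviations -- Python/numpy;
    'dev' is the perpendicular-deviation signal from such a straight-line
    fit, 'fs' its sample rate, and the 3-8 Hz band is only my guess at
    where the tremor lives:)

    import numpy as np

    def tremor_band_fraction(dev, fs, band=(3.0, 8.0)):
        """Fraction of the deviation signal's energy that falls in the
        assumed tremor band -- a crude "how shaky is this track" number."""
        dev = np.asarray(dev, dtype=float)
        dev = dev - dev.mean()
        spectrum = np.abs(np.fft.rfft(dev)) ** 2
        freqs = np.fft.rfftfreq(len(dev), d=1.0 / fs)
        total = spectrum.sum()
        if total == 0.0:
            return 0.0
        in_band = spectrum[(freqs >= band[0]) & (freqs <= band[1])].sum()
        return in_band / total
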
    That only applies if you are using the pointing device *as*
    a "pointing device". I hope to generalize my work for other
    similar uses.
    <shrug> Someone's got to do it. :> (there are technologies
    that try to address these issues; none are open source and I
    suspect often "encumbered")
    I recall a movie, "Awakenings" (Robin Williams lead) that seems
    consistent with your reference. No idea re: _Island of the Colorblind_
    though I will chase down that reference.

    Aside: I am always amazed at how many people are ignorant of the
    prevalence of colorblindness! I mentioned this to a neighbor
    who is a school counselor. She was adamant that I was *wrong*
    in my assertions:

    "That can't be! If that was true, one kid (boy) in every classroom
    would (statistically) suffer from colorblindness!"

    "Exactly!"

    Amusing to think someone responsible for establishing curriculum,
    etc. was unaware of her "customers'" needs.

    (she has, since, conceded her previous ignorance)

    I wonder if she knows the prevalence of ET in her *peers*? :>
     
    D Yuniskis, Apr 24, 2011
    #3
  4. mblume

    mblume Guest

    On Sat, 23 Apr 2011 02:10:09 -0700, D Yuniskis wrote:
    You could provide a "moving target" on the screen that the user has to
    follow. You know the trajectory of the target, so you can record the
    user's response.

    Other idea: make the pointer itself "sticky" and less sensitive (i.e. the user
    has to move the mouse a longer way). A lot of experimentation seems to be
    called for.

    Good luck!
    Martin
     
    mblume, Apr 24, 2011
    #4
  5. On Sat, 23 Apr 2011 11:10:09 +0200, D Yuniskis wrote:
    For a mouse, the results of that will vary depending on:
    - position of mouse mat respective to shoulder
    - position of mouse respective to mouse mat
    - position of hand respective to table edge
    - distribution of crumbs on mouse mat
    There are different strategies of using a pointer device. When moving a
    visual pointer to a distant position, I like to have the sensitivity high
    and 'swing' the device to the destination in 1-3 jerks. For a close
    destination, the movement is more regular. I would assume that different
    people with different physical disabilities and different amounts of
    muscle memory use different pointing strategies.

    When doing things that don't have a visual pointer, e.g. controlling the
    heading of a spaceship, the "trajectory" followed depends on more complex
    visual cues; I don't think this can be calibrated.
     
    Boudewijn Dijkstra, Apr 24, 2011
    #5
  6. D Yuniskis

    D Yuniskis Guest

    Hi Martin,

    Of sorts. (stiction ~= static friction) Though I suspect that's
    not really what's happening in the neuromuscular system at the time!
    ;-)

    I think my dead-band analogy may be more appropriate. I think
    the "control" wants to be applied but is just inhibited until
    some critical point is passed. The more significant issue is
    that the behaviors on the two axes are very different!
    Following a moving target complicates the user's task and, I think,
    requires far more control on his part. One problem with control systems
    is trying to *remove* their influence from the phenomenon being
    measured. Often control systems make the system response *worse* as
    they make assumptions about what the controlled system's
    characteristics actually are and impose their own artifacts on
    the resulting system.

    E.g., you can tune a PID controller until it is "perfect" but, if
    it isn't inherently suited to the process you are trying to control,
    you will usually end up introducing *more* instability to that
    system!

    The goal and response aren't the problem. I'm looking for a *simpler*
    way of crunching a "2D image" to determine the information I need
    (instead of my current "brute force" approach).
    I think "classic" filters won't work. Their onset and release
    behaviors need to vary. But, I can tweak that more when I have
    "better input".
    Thanks!
    --don
     
    D Yuniskis, Apr 24, 2011
    #6
  7. D Yuniskis

    D Yuniskis Guest

    Hi Boudewijn,

    Remember, I'm talking about "pointing devices", not necessarily
    *mice*! E.g., touchpad, head-stick, eye tracker, etc. ...
    .... as they will on every other "pointing technology"!
    Actually, their "forearm angle" seems to be the most
    easily identifiable physical characteristic. (Sticking
    with the mouse device, ) people tend to make mouse
    movements aligned with the worksurface (for want of a better
    phrase). I.e., N-S is away-towards the user while E-W is
    parallel to the display plane (roughly speaking).

    So, if the forearm is at a 45 degree angle to the "display",
    N is not "along the radius/ulna" but, rather, "away from the
    torso" (i.e., the forearm translates *sideways* wrt the forearm
    itself instead of "pure extension").

    In this case, the typical "side to side" tremor appears as
    a diagonal oscillation.

    <frown> My language skills get in the way. Put a pencil in
    your hand. Hold your forearm on a diagonal. With the elbow
    stationary, rock your hand from side to side. You will have
    made a series of overlapping marks *normal* to your forearm
    that appear at an *angle* to your body/worksurface itself.

    Now, move your hand "due north" while making this side to side
    oscillation.

    Repeat this exercise with your forearm pointed "straight ahead"
    (due north). Compare the two traces.

    [The start and end points should be the same, relatively speaking.
    End due north of Start. What I want to do is determine the
    orientation -- and magnitude -- of the "disturbance" that is
    acting "perpendicular to your forearm"]
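
    (One way to pull that orientation and magnitude out of a recorded
    track is a principal-axis fit on the residuals -- Python/numpy
    sketch, nothing settled, just the obvious first attempt:)

    import numpy as np

    def disturbance_axis(samples):
        """Estimate the orientation and magnitude of the oscillatory
        component of a 2D track.

        Returns (angle, magnitude): the angle (radians, device frame) of
        the axis carrying most of the residual motion once the gross
        Start->End displacement is removed, and the RMS excursion along it."""
        p = np.asarray(samples, dtype=float)
        # Remove the gross trajectory with a straight-line (chord) fit.
        t = np.linspace(0.0, 1.0, len(p))[:, None]
        ideal = p[0] + t * (p[-1] - p[0])
        resid = p - ideal
        # Principal axis of the residuals = dominant disturbance axis.
        cov = np.cov(resid.T)
        eigvals, eigvecs = np.linalg.eigh(cov)
        major = eigvecs[:, np.argmax(eigvals)]
        angle = np.arctan2(major[1], major[0])
        magnitude = np.sqrt(np.max(eigvals))     # ~RMS along that axis
        return angle, magnitude
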
    For a mouse, this actually does seem to have a big role in
    determining the magnitude of the tremor. Perhaps it's just
    harder to control an extended arm?
    Again, you're thinking just in terms of "pointing". Pointing devices
    can be used in other modes. This is especially important when you
    want to minimize motion -- especially over large distances.

    E.g., imagine using a touch *panel* on a large screen display.
    The motions become exaggerated. You need to design the "screen"
    so that there is more *localized* referencing.
     
    D Yuniskis, Apr 24, 2011
    #7
  8. mblume

    mblume Guest

    On Sun, 24 Apr 2011 12:11:20 -0700, D Yuniskis wrote:
    The idea was that the user's movement is overlaid with noise. By giving a
    trajectory to follow, you should be able to differentiate between the
    voluntary action (trajectory) and the involuntary trembling (noise).
    Of course, the task to be performed must not be too demanding, otherwise
    it will only frustrate the user.
    This is not a classic filter, because it only moves if the difference
    between the pointer position on screen and the internal pointer position
    exceeds a certain limit and then perhaps only moves a fraction of the
    distance difference. By varying the two constants (deadband, fraction)
    you can take into account the noisiness of the input signal.
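
    (Roughly this, as a Python sketch; the two constants are exactly the
    deadband and fraction I mention, with made-up default values:)

    class StickyPointer:
        """Pointer that only follows the device once the raw position
        has drifted more than 'deadband' away, and then only closes a
        'fraction' of the gap per update."""

        def __init__(self, deadband=8.0, fraction=0.3):
            self.deadband = deadband
            self.fraction = fraction
            self.x = 0.0
            self.y = 0.0

        def update(self, raw_x, raw_y):
            dx = raw_x - self.x
            dy = raw_y - self.y
            dist = (dx * dx + dy * dy) ** 0.5
            if dist > self.deadband:
                # Move only part of the way toward the raw position.
                self.x += self.fraction * dx
                self.y += self.fraction * dy
            return self.x, self.y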


    Regards
    Martin
     
    mblume, Apr 25, 2011
    #8
  9. D Yuniskis

    D Yuniskis Guest

    Hi Martin,

    Correct -- if you assume their "movement" is their *intention*
    and their "noise" is their "disease process"...
    Yes. Though simply stating "move the pointing device 'forward'"
    (forward meaning up, down, left, right, whatever), implying a linear
    motion, *should* allow me to identify the "noise" atop the "signal".
    You have to think in terms of their psychology, as well.
    If what you are doing is *clearly* trying to "compensate
    for their disability", you risk pissing them off (to put
    it bluntly). Sure, some people are willing to cooperate
    with you when your intentions "seem to be good". But, for
    many (most?), you are just reminding them of something
    of which they are already *keenly* aware. And, run the risk
    of "raising the bar" for the performance they expect of you:
    "Sheesh! Guy made me go through this special procedure
    SUPPOSEDLY to account for my shaking... and the thing *still*
    works like CRAP!" (even though it could be outperforming any
    other "pointing device usage" the user has had previously).

    If, OTOH, you are simply "calibrating the interface" and do so
    FOR ALL USERS in a way that doesn't single out this particular
    type of malady -- yet *magically* seems to work better for those
    users who *have* this sort of tremor condition -- then the user
    feels "normal".

    Some of the comments I've heard from users with "disabilities"
    over the years are specifically oriented towards this sort of
    thing. E.g., a "blind" user once complained that all of the
    products DESIGNED FOR HIM (!) "looked blind". When I told him I
    didn't know what he meant by that phrase, he clarified that the
    devices all "looked like they were produced for blind users".
    And, while he couldn't actually *see* the devices he was talking
    about, I had to admit that his point was "spot on". As he
    later expanded, "this device tells anyone near me that I'm blind,
    even though there is no need for it to do so!".

    E.g., if you examine a wristwatch intended for visually impaired
    users, at first glance, it just looks like a generic, mechanical
    wristwatch. Kinda boring. Functional. It's only if you carefully
    observe its *use* that you realize how different it is (i.e., the
    crystal is hinged so it can be lifted and the positions of the
    hands "felt" -- but, this action can be done so casually that you
    wouldn't notice it unless you were watching intently)
    Sorry, what I meant by that was not "classic" in the sense of
    Bessel, Butterworth, etc. but, rather, "static", non-adaptive, etc.
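
    (To give a flavor of the non-static alternative I'm thinking of --
    a one-pole smoother whose coefficient differs between onset and
    release; Python sketch, both coefficients are invented numbers:)

    class AsymmetricSmoother:
        """One-pole low-pass whose coefficient depends on whether the
        input is growing ("onset") or dying away ("release")."""

        def __init__(self, alpha_onset=0.5, alpha_release=0.1):
            self.alpha_onset = alpha_onset      # track intent quickly
            self.alpha_release = alpha_release  # smooth the residue heavily
            self.y = 0.0

        def step(self, x):
            alpha = self.alpha_onset if abs(x) > abs(self.y) else self.alpha_release
            self.y += alpha * (x - self.y)
            return self.y

    (Applied per axis to the deltas, after the coordinate rotation.)
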
    Remember, the sorts of feedback you have available to you aren't
    what a "typical user" might expect. A pointing device need not be
    used to point *to* "things". So, the feedback might NOT be "am I
    pointing *at* that thing, yet?"

    E.g., a classic GUI w/ mouse is relatively easy to adapt. Just
    overdamp the response INTENSELY! But, you will find that the
    user-with-tremor will get tired of trying to interact with such
    a sluggish device. It "feels" like you've strapped a 10 pound
    weight to his wrist...

    A smarter approach, in that case, might be to dynamically vary the
    sensitivity and "filtering" based on proximity to "likely targets"
    (since the mouse-in-GUI context uses the mouse to *literally*
    point *to* things). So, as the mouse-cursor approaches a clickable
    button, for example, you could apply heavier filtering than when
    it was slewing over "open water".
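
    (A sketch of that proximity idea -- Python; "targets" is a
    hypothetical list of clickable rectangles that the GUI would have to
    supply, and the numbers are arbitrary:)

    def smoothing_for_position(x, y, targets,
                               near_alpha=0.05, far_alpha=0.6,
                               near_radius=40.0):
        """Pick a smoothing coefficient from the distance to the nearest
        target: heavy smoothing (small alpha) close to something
        clickable, light smoothing when slewing over "open water"."""
        def dist_to_rect(rect):
            left, top, right, bottom = rect
            cx = min(max(x, left), right)     # nearest point inside the rect
            cy = min(max(y, top), bottom)
            return ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5

        d = min((dist_to_rect(r) for r in targets), default=float("inf"))
        if d >= near_radius:
            return far_alpha
        # Blend linearly between the two regimes.
        return near_alpha + (far_alpha - near_alpha) * (d / near_radius)

    (The returned alpha would just feed whatever smoother is already in
    the pipeline.)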
     
    D Yuniskis, Apr 25, 2011
    #9
  10. Could you incorporate a function where a 'failed click' causes buttons
    and hypertext to increase in size? I can see where this might
    interfere with screen properties menu acquisition - I dunno. (create
    an invisible area around 'radio buttons'/hypertext to detect near
    misses?)

    Not very helpful for 'drag and drop'.
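
    (Maybe something like this for the near-miss detection? Python
    sketch; 'controls' is an imagined list of (rect, handler) pairs and
    'slop' is the width of the invisible margin:)

    def resolve_click(x, y, controls, slop=12.0):
        """Return the handler of the control that was hit, accepting
        clicks that land within 'slop' pixels of a control's edge.
        Exact hits win; otherwise the nearest near miss wins."""
        best = None
        best_d = slop
        for (left, top, right, bottom), handler in controls:
            if left <= x <= right and top <= y <= bottom:
                return handler                    # exact hit
            cx = min(max(x, left), right)
            cy = min(max(y, top), bottom)
            d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
            if d < best_d:
                best_d = d
                best = handler
        return best                               # None if no near miss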
     
    1 Lucky Texan, Apr 26, 2011
    #10
  11. D Yuniskis

    D Yuniskis Guest

    If you limit yourself to the "Windows"/GUI mindset, there are a lot
    of things you "could do".

    And, you'll have a device that *only* works in that environment!
    <frown>

    I'm trying to address the "noise" that is interfering with the
    "signal" by compensating for it "at the source" (or, as near to
    it as is possible without involving biology :> ).

    By way of a perverted example, a person who is hard of hearing
    *could* ask The World to be "a little louder, please". *Or*,
    could outfit their "acoustical perception device" (aka "ear")
    with something that compensates for the problems that "device"
    has -- leaving the rest of the world "as is".

    I.e., why change the GUI/"system" if you can compensate for
    the problem in the "detector"?

    I think I need to drag out some of my old image processing texts
    and see if the solution lies there -- though I think that would
    require dealing with static/off-line data (which throws away
    a lot of information).

    <frown> Maybe just stick with what I've got (boo).
     
    D Yuniskis, Apr 26, 2011
    #11
  12. mblume

    mblume Guest

    On Mon, 25 Apr 2011 14:19:17 -0700, D Yuniskis wrote:
    I was thinking of a moving target that the user had to follow, which
    makes it unambiguously clear what is meant, but ...
    .... you have clearly more experience with disabled users
    and have given more thought to it than I have.
    Other idea: make the user press four (rather largish) buttons. Record
    the path the mouse takes (which should normally be a straight line) and from
    the deviations guess the amount of damping / deadband needed.
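
    (As a sketch, in Python/numpy -- the mapping from deviation to the
    two constants is pure guesswork and would need tuning on real users:)

    import numpy as np

    def suggest_constants(paths):
        """Given recorded button-to-button tracks (lists of (x, y)
        samples), guess a deadband and damping fraction from how far
        the user strays from the straight lines."""
        rms_values = []
        for path in paths:
            p = np.asarray(path, dtype=float)
            t = np.linspace(0.0, 1.0, len(p))[:, None]
            ideal = p[0] + t * (p[-1] - p[0])
            resid = p - ideal
            rms_values.append(np.sqrt(np.mean(np.sum(resid ** 2, axis=1))))
        rms = float(np.mean(rms_values))
        deadband = 1.5 * rms                 # ignore excursions of tremor size
        fraction = 1.0 / (1.0 + rms / 4.0)   # heavier damping for bigger tremor
        return deadband, fraction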

    Thanks for the insights.
    My idea was to just damp the response to the "right" degree. Overdamping
    makes the system unresponsive, underdamping will make it too nervous.
    Hence the calibration idea.
    This works for buttons to click or straight lines to draw (where you can
    do something like "stick to grid" in PowerPoint), but not for general mouse
    movement (imagine that the user wants to draw a curvy line).

    Regards
    Martin
     
    mblume, Apr 26, 2011
    #12
  13. D Yuniskis

    D Yuniskis Guest

    Hi Martin,

    Understood. But, that makes more assumptions that you'd have
    to evaluate in the application domain:

    - it assumes the pointing device is used in a "pointing" mode
    - it requires visual feedback be provided (*where* is he pointing)
    - it brings eye-hand issues into the equation (another variable?)
    - it imposes some (unknown) sort of "gain" on motion->feedback
    - it defines the magnitudes and directions of the motions sought
    etc.

    I.e., it begs *more* questions (should I try to "calibrate" large
    motions or small ones? how much "practice" should I discount as
    the user gets used to the "gain" of the feedback system?) instead
    of letting me focus on just one at a time.
    <shrug> I've just been around a wide variety of different
    users in different application domains and actively note
    the problems they have "using things" -- regardless of
    whether or not they have a "disability", etc.

    E.g., a "tradesman" probably has a "heavier touch" than
    an "office worker" so the actuating force required for
    controls intended for use by one would differ from those
    intended for use by the other...
    Again, this drags in all of the above issues. Along with
    the idiosyncrasies of the pointing device in question.

    If I say, "move the <pointing_device> STRAIGHT forward"
    *without* qualifying it as to distance, speed, *actual*
    orientation, etc. then, I can look at what the user
    considers (physically) "straight" to be. If the trajectory
    is "straight as an arrow", displaces a "significant distance"
    and is at a low enough rate (that would allow the ET "noise"
    to be evident), then I can deduce there are no significant
    signs of tremor.
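
    (Mechanizing that decision might look something like this -- Python
    sketch; every threshold here is an assumption, not something I've
    validated:)

    def evaluate_trace(distance, mean_speed, dev_rms, band_fraction,
                       min_distance=200.0, max_speed=300.0):
        """Decide whether a "move it STRAIGHT forward" trace is usable,
        and if so, whether it shows significant tremor.

        Returns one of: "retry", "no-tremor", "tremor"."""
        if distance < min_distance:
            return "retry"          # too short a movement to judge
        if mean_speed > max_speed:
            return "retry"          # too fast for the tremor to show up
        if dev_rms < 1.0 and band_fraction < 0.2:
            return "no-tremor"      # straight as an arrow; nothing to filter
        return "tremor"             # characterize further / tune filters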

    OTOH, if there *are* signs -- or, if the user's choice of
    rate/distance/orientation/etc. suggests "further observation
    required", I can prompt the user to "do that again, only
    slower/faster/etc."

    And, there is never an "obvious" situation where the user feels
    challenged/threatened/intimidated (an obvious concern is if I end
    up saying "faster! No, *slower*. No, not *that* slow. Wait,
    not that *fast*...")
    Yes. Unfortunately, people don't behave like control systems
    that can be easily "tuned" :-/
    Exactly! E.g., consider a gestural interface built upon the pointing
    device. You would like a rich gesture lexicon -- but that requires
    more capability from the user (in terms of getting his "signal" thru
    that "noise"). It would be unacceptable, for instance, if certain
    users could not issue certain gestures (and thus the objects that those
    gestures represent!)
     
    D Yuniskis, Apr 27, 2011
    #13
