
Themes

Set your presentation theme:
Black (default) - White - League - Sky - Beige - Simple
Serif - Night - Moon - Solarized

H:

Interaction in nub

Jean Pierre Charalambos
Universidad Nacional de Colombia
Presentation best seen online
See also the source code

H:

Index

  1. Goal
  2. Nub design
  3. Applications
  4. Future work

H:

Goal

Provide interactivity to application objects from any input source

in the 'simplest' possible way

V:

Goal: Main interaction tasks

Three main interaction tasks (see 'A Survey of Interaction Techniques for Interactive 3D Environments', Jankowski et al.):

  • Navigation
  • Picking and interaction
  • Application control

V:

Goal: Main interaction tasks

1. 2D & 3D Navigation

Basic camera types:

  • Orbit-like methods
  • First person
  • Third person

V:

Goal: Main interaction tasks

2. Picking & Interaction

  • Picking strategies: from input sources or programmatically
  • Interaction: emulate 6 DOFs (default behavior from multiple _input sources_)

V:

Goal: Main interaction tasks

3. Application control (custom behaviors)

Post-WIMP interaction styles

  • Interfaces “containing at least one interaction technique not dependent on classical 2D widgets” [[van Dam]](http://dl.acm.org/citation.cfm?id=253708), such as:
  • Virtual, mixed and augmented reality
  • [Tangible interaction](https://en.wikipedia.org/wiki/Tangible_user_interface), ubiquitous and pervasive computing, context-aware computing
  • Handheld, or mobile interaction
  • Perceptual and [affective computing](https://en.wikipedia.org/wiki/Affective_computing)

N:

WIMP: "window, icon, menu, pointing device". Classical 2D widgets: menus and icons.

H:

Nub Design

  1. API considerations
  2. The eye
  3. Picking & Interaction
  4. Application Control

V:

Nub Design

API considerations

The scene is a high-level Processing scene-graph handler

A node encapsulates a 2D/3D coordinate system

Simplicity: a scene and some nodes attached to it implement the three main interaction tasks

V:

Nub Design

The scene

A high-level scene-graph object that provides eye, input and timing handling to Processing

The scene also implements some drawing routines not present in Processing

V:

Nub Design

Default eye

```
 World
  ^
   \
    eye
```

```java
Scene scene;
// to be run only at startup:
void setup() {
  scene = new Scene();
}
```

The scene eye is just a Node instance

V:

Nub Design

Hierarchy setup

```
 World
  ^
  |\
  1 eye
  ^
  |\
  2 3
```

```java
Scene scene;
Node n1, n2, n3;
// to be run only at startup:
void setup() {
  scene = new Scene();
  // creates a hierarchy of 'attached' nodes
  n1 = new Node(scene);
  n2 = new Node(n1);
  n3 = new Node(n1) {
    // note that within graphics() the geometry is
    // defined in the node's local coordinate system
    @Override
    public boolean graphics(PGraphics pg) {
      pg.sphere(50);
      return true;
    }
  };
}
```

Override the Node graphics(PGraphics) method to customize the node appearance.

V:

Nub Design

```
 World
  ^
  |\
  1 eye
  ^
  |\
  2 3
```

```java
// to be run continuously:
void draw() {
  scene.render();
}
```

V:

Nub Design

Picking & Interaction

1. Picking -> tag a node using an arbitrary name
2. Interaction -> convert user gesture data into a node interaction

V:

Nub Design

Picking & Interaction

tag(tag, node)

  • Low-level ray casting (yes/no): [tracks(node, pixelX, pixelY)](https://visualcomputing.github.io/nub-javadocs/nub/core/Graph.html#tracks-nub.core.Node-int-int-)
  • (Optimized) High-level ray casting (updates the tagged-node): [tag(tag, pixelX, pixelY)](https://visualcomputing.github.io/nub-javadocs/nub/core/Graph.html#tag-java.lang.String-int-int-)
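The two ray-casting flavors can be modeled in plain Java. This is a sketch independent of nub: only the method names (`tracks`, `tag`, `node`) come from the API above; the `Node` fields and the pixel-distance hit test are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal model of nub's picking API (illustrative, not nub itself).
class Picking {
  static class Node {
    float x, y, radius; // assumed: projected screen position and picking radius
    Node(float x, float y, float radius) { this.x = x; this.y = y; this.radius = radius; }
  }

  final List<Node> nodes = new ArrayList<>();
  final Map<String, Node> tags = new HashMap<>();

  // low-level ray casting: does the pixel hit this node? (yes/no)
  boolean tracks(Node node, int pixelX, int pixelY) {
    float dx = pixelX - node.x, dy = pixelY - node.y;
    return dx * dx + dy * dy <= node.radius * node.radius;
  }

  // high-level ray casting: updates the node tagged with 'tag'
  void tag(String tag, int pixelX, int pixelY) {
    for (Node node : nodes)
      if (tracks(node, pixelX, pixelY)) { tags.put(tag, node); return; }
    tags.remove(tag); // miss: clear the tag
  }

  Node node(String tag) { return tags.get(tag); } // may be null
}
```

Calling something like `tag("mouse", mouseX, mouseY)` from a mouse handler would keep the "mouse" tag pointing at whatever node lies under the cursor.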

V:

Nub Design

Picking & Interaction

Node interaction pattern: interactNode(node, gesture...)

Tagged-node interaction pattern: interactTag(tag, gesture...) : interactNode(node(tag), gesture...)

  • ```gesture...``` is a [varargs](https://docs.oracle.com/javase/8/docs/technotes/guides/language/varargs.html) argument defining the [screen-space](https://github.com/VisualComputing/nub#space-transformations) user gesture.
  • [node(tag)](https://visualcomputing.github.io/nub-javadocs/nub/core/Graph.html#node-java.lang.String-) returns the node tagged with ```tag```, which may be ```null```.
  • Details and code snippets [here](https://github.com/VisualComputing/nub#interactivity).
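The delegation between the two patterns can be sketched in plain Java. Only the names (`interactNode`, `interactTag`, `node(tag)`, varargs `gesture...`) come from the slides; the `Node` class and its `interact` body are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the two interaction patterns (illustrative model, not nub itself).
class Interaction {
  static class Node {
    final List<Object[]> log = new ArrayList<>(); // records received gestures

    // default reaction to a screen-space gesture (varargs)
    void interact(Object... gesture) { log.add(gesture); }
  }

  final Map<String, Node> tags = new HashMap<>();

  Node node(String tag) { return tags.get(tag); } // may be null

  // node interaction pattern: interactNode(node, gesture...)
  void interactNode(Node node, Object... gesture) {
    if (node != null) // null-safe: node(tag) may return null
      node.interact(gesture);
  }

  // tagged-node interaction pattern is just delegation:
  // interactTag(tag, gesture...) == interactNode(node(tag), gesture...)
  void interactTag(String tag, Object... gesture) {
    interactNode(node(tag), gesture);
  }
}
```

The null check is what makes the tagged-node pattern safe to call every frame, whether or not the last ray cast hit anything.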

V:

Nub Design

Picking & Interaction: Custom node behaviors

Override the node interact(gesture...) method and then invoke it with the node-interaction or tagged-node-interaction patterns above.

  • Check the [CustomNodeInteraction](https://github.com/VisualComputing/nub/blob/master/examples/demos/CustomNodeInteraction/CustomNodeInteraction.pde) example.
  • To customize the eye behavior, refer to the [CustomEyeInteraction](https://github.com/VisualComputing/nub/tree/master/examples/demos/CustomEyeInteraction) example.
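A custom behavior boils down to overriding `interact(gesture...)` in a (possibly anonymous) node subclass. Below is a plain-Java sketch: the `Node` stand-in and the scale-by-gesture behavior are illustrative assumptions, only the override pattern comes from the slides.

```java
// Sketch of a custom node behavior via overriding interact(...)
// (illustrative stand-in for nub's Node, not the real class).
class CustomBehavior {
  static class Node {
    float scaling = 1;
    void interact(Object... gesture) { } // default: ignore gestures
  }

  // a node whose custom behavior scales it by the first gesture datum
  static Node scalingNode() {
    return new Node() {
      @Override
      void interact(Object... gesture) {
        if (gesture.length > 0 && gesture[0] instanceof Float)
          scaling *= (Float) gesture[0];
      }
    };
  }
}
```

A gesture forwarded as `interactNode(node, 2.0f)` would then double the node's scaling, while gesture data the override does not recognize is simply ignored.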

H:

Applications

  • Navigation
  • Picking and interaction
  • Application control

V:

Applications

Navigation

Orbit-like methods

All examples using a mouse

V:

Applications

Navigation

First & third person

V:

Applications

Picking

FIXED & ADAPTIVE precision

V:

Applications

Picking

EXACT precision

V:

Applications

Interaction

Standard devices

All examples using a mouse and/or a keyboard

V:

Applications

Interaction

Non-conventional devices

V:

Applications

Application control

H:

References