ISO/IEC JTC 1/SC 29 N




Programming information


The following programming-related node is used in ARAF: Script.

Script

XSD Description


Functionality and semantics

As defined in ISO/IEC 14772-1:1997, section 6.40.

The Script node is used to program behaviour in a scene. Script nodes typically:



  • signify a change or user action;

  • receive events from other nodes;

  • contain a program module that performs some computation;

  • effect change somewhere else in the scene by sending events.

Each Script node has associated programming language code, referenced by the url field, that is executed to carry out the Script node's function. That code is referred to as the "script" in the rest of this description. Details on the url field can be found in ISO/IEC 14772-1:1997, section 4.5, VRML and the World Wide Web.

Browsers are not required to support any specific language. Detailed information on scripting languages is described in ISO/IEC 14772-1:1997, section 4.12, Scripting. Browsers supporting a scripting language for which a language binding is specified shall adhere to that language binding.

Sometime before a script receives the first event it shall be initialized (any language-dependent or user-defined initialize() is performed). The script is able to receive and process events that are sent to it. Each event that can be received shall be declared in the Script node using the same syntax as is used in a prototype definition:

eventIn type name

The type can be any of the standard VRML fields (as defined in ISO/IEC 14772-1:1997, section 5, Field and event reference). The name shall be an identifier that is unique for this Script node.

The Script node is able to generate events in response to the incoming events. Each event that may be generated shall be declared in the Script node using the following syntax:

eventOut type name

With the exception of the url field, exposedFields are not allowed in Script nodes.
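The event model described above (initialization before the first event, declared eventIns, and eventOuts generated in response) can be sketched as follows. This is a minimal illustrative model, not part of any standard API; the names ScriptNode, receive, and behaviour are assumptions.

```python
# Minimal sketch of the Script node event model described above.
# ScriptNode, receive() and behaviour are illustrative names, not a standard API.

class ScriptNode:
    def __init__(self, event_ins, event_outs, behaviour):
        self.event_ins = set(event_ins)           # declared "eventIn type name" entries
        self.event_outs = {name: [] for name in event_outs}
        self.behaviour = behaviour                # stands in for the "script" the url field references
        self.initialized = False

    def receive(self, name, value):
        if name not in self.event_ins:
            raise ValueError(f"undeclared eventIn: {name}")
        # The script is initialized some time before the first event is delivered.
        if not self.initialized:
            self.initialized = True
        # The script may respond by generating events on its declared eventOuts.
        for out_name, out_value in self.behaviour(name, value):
            self.event_outs[out_name].append(out_value)

# A toy behaviour: forward a touch event as a colour change.
node = ScriptNode(
    event_ins=["touched"],
    event_outs=["color_changed"],
    behaviour=lambda name, value: [("color_changed", (1.0, 0.0, 0.0))] if value else [],
)
node.receive("touched", True)
print(node.event_outs["color_changed"])  # [(1.0, 0.0, 0.0)]
```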

If the Script node's mustEvaluate field is FALSE, the browser may delay sending input events to the script until its outputs are needed by the browser. If the mustEvaluate field is TRUE, the browser shall send input events to the script as soon as possible, regardless of whether the outputs are needed. The mustEvaluate field shall be set to TRUE only if the Script node has effects that are not known to the browser (such as sending information across the network). Otherwise, poor performance may result.

Once the script has access to a VRML node (via an SFNode or MFNode value either in one of the Script node's fields or passed in as an eventIn), the script is able to read the contents of that node's exposed fields. If the Script node's directOutput field is TRUE, the script may also send events directly to any node to which it has access, and may dynamically establish or break routes. If directOutput is FALSE (the default), the script may only affect the rest of the world via events sent through its eventOuts. The results are undefined if directOutput is FALSE and the script sends events directly to a node to which it has access.

A script is able to communicate directly with the VRML browser to get information such as the current time and the current world URL. This is strictly defined by the API for the specific scripting language being used.

The location of the Script node in the scene graph has no effect on its operation. For example, if a parent of a Script node is a Switch node with whichChoice set to "-1" (i.e., ignore its children), the Script node continues to operate as specified (i.e., it receives and sends events).


User interactivity


The following user-interactivity-related nodes are used in ARAF: InputSensor, OutputActuator, SphereSensor, TimeSensor, TouchSensor, MediaSensor, PlaneSensor.

InputSensor

XSD Description


Functionality and semantics

As defined in ISO/IEC 14496-11 (BIFS), section 7.2.2.71.2.

The InputSensor node is used to add entry points for user inputs into a BIFS scene. It allows user events to trigger updates of the value of a field or the value of an element of a multiple field of an existing node.

Input devices are modelled as devices that generate frames of user input data. A device data frame (DDF) consists of a list of values of any of the allowed types for node fields. Values from DDFs are used to update the scene. For example, the DDF definition for a simple mouse is:

MouseDataFrame [

SFVec2f cursorPosition

SFBool singleButtonDown

]

NOTE — The encoding of the DDF is implementation-dependent. Devices may send complete DDFs or, in some cases, subsets of a DDF.

The buffer field is a buffered bit string which contains a list of BIFS-Commands in the form of a CommandFrame (see ISO/IEC 14496-11, section 8.6.2). The allowed BIFS-Commands are the following: FieldReplacement (see ISO/IEC 14496-11, section 8.6.21), IndexedValueReplacement (see ISO/IEC 14496-11, section 8.6.22) and NodeDeletion with a NULL node argument (see ISO/IEC 14496-11, section 8.7.3.2). The buffer shall contain a number of BIFS-Commands that matches the number of fields in the DDF definition for the attached device. The type of the field replaced by the nth command in the buffer shall match the type of the nth field in the DDF definition.



The url field specifies the data source to be used (see ISO/IEC 14496-11, section 7.1.1.2.7.1). The url field shall point to a stream of type UserInteractionStream, whose access units are DDFs. When the enabled field is set to TRUE, upon reception of a DDF, each value (in the order of the DDF definition) is placed in the corresponding replace command according to the DDF definition, and then the replace command is executed. These updates are not time-stamped; they are executed at the time of the event, assuming a decoding time of zero. It is not required that all the replace commands be executed when the buffer is executed. Each replace command in the buffer can be independently triggered depending on the data present in the current DDF. Moreover, the presence in the buffer field of a NodeDeletion command at the nth position indicates that the value of the DDF corresponding to the nth field of the DDF definition shall be ignored.
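The per-field dispatch described above can be sketched as follows. This is an illustrative model only: the real buffer holds encoded BIFS-Commands, not Python callables, and SKIP here stands in for a NodeDeletion command with a NULL node argument.

```python
# Illustrative sketch of InputSensor DDF processing as described above.
# SKIP models a NodeDeletion command at position n: ignore the nth DDF value.

SKIP = None

def process_ddf(ddf_values, buffer_commands, enabled=True):
    """Apply each DDF value through its matching replace command, in DDF order."""
    if not enabled:
        return []
    executed = []
    for value, command in zip(ddf_values, buffer_commands):
        if command is SKIP:          # nth value of the DDF is ignored
            continue
        executed.append(command(value))
    return executed

# Mouse-like DDF: [cursorPosition, singleButtonDown]; the button value is ignored.
scene = {}
commands = [lambda v: scene.update(cursor=v) or ("cursor", v), SKIP]
process_ddf([(0.5, 0.5), True], commands)
print(scene)  # {'cursor': (0.5, 0.5)}
```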

The eventTime eventOut carrying the current time is generated after a DDF has been processed.


OutputActuator

XSD Description





BIFS Textual Description

EXTERNPROTO OutputActuator [

eventIn SFBool activate

exposedField SFBool enabled TRUE

exposedField MFString url []



# Any number of the following may then follow:

eventIn eventType DDFEventName

] "org:mpeg:outputActuator"

Functionality and semantics

The OutputActuator proto is used to communicate between the scene and an MPEG-V actuator. How these commands are mapped to the physical device is outside the scope of this standard. It should be noted that the device interprets the command and produces the effect immediately when the command is received. The proto definition of OutputActuator is given above.

The OutputActuator PROTO can receive a variable number of events that in turn generate Device Data Frames (DDFs) that are sent to the actuator. Each eventIn corresponds to one field in the DDF and has the same type. When the activate eventIn is received, the DDF is assembled and sent to the device.

When declaring an OutputActuator in a BIFS scene, the eventIn fields shall be placed in their order of appearance in the associated DDF, after all other fields are declared. The activate eventIn shall be declared first in the extern proto declaration.

The url field specifies the device to be controlled.

DDFs are generated only if the enabled field is TRUE.
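The assembly rule above (cache incoming eventIn values, then emit a DDF in declared field order when activate arrives, but only while enabled) can be sketched as follows. The class and method names are illustrative, not part of the standard.

```python
# Hedged sketch of the OutputActuator behaviour: eventIns are cached and, when
# the activate eventIn arrives, a DDF is assembled in declared field order.

class OutputActuator:
    def __init__(self, field_order, enabled=True):
        self.field_order = list(field_order)  # DDF field order from the proto declaration
        self.enabled = enabled
        self.values = {}
        self.sent = []                        # DDFs "sent" to the device

    def receive(self, name, value):
        self.values[name] = value

    def activate(self):
        # DDFs are generated only if the enabled field is TRUE.
        if not self.enabled:
            return
        self.sent.append(tuple(self.values.get(f) for f in self.field_order))

# A Light-type actuator takes (intensity, color), in that order.
light = OutputActuator(["intensity", "color"])
light.receive("intensity", 0.8)
light.receive("color", (1.0, 1.0, 0.0))
light.activate()
print(light.sent)  # [(0.8, (1.0, 1.0, 0.0))]
```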

The mandatory input events of the OutputActuator interface are as follows:



Actuator Type       Input events, in the given order                        Input event meaning
Light               SFFloat, SFColor                                        intensity, color
Vibration           SFFloat                                                 intensity
Tactile             MFFloat                                                 intensity
Flash               SFFloat, SFColor                                        intensity, color
Heating             SFFloat                                                 intensity
Cooling             SFFloat                                                 intensity
Wind                SFFloat                                                 intensity
Sprayer             SFFloat, SFInt32                                        intensity, sprayingType
Scent               SFFloat, SFInt32                                        intensity, scent
Fog                 SFFloat                                                 intensity
Rigid Body Motion   MFVec3f, MFVec3f, MFVec3f, MFVec3f, MFVec3f, MFVec3f    direction, speed, acceleration, angle, angleSpeed, angleAcceleration
Kinesthetic         MFVec3f, MFVec3f, MFVec3f, MFVec3f                      position, orientation, force, torque

In the following example, two actuators of different types (Light and Vibration) are presented. In this case the EXTERNPROTO OutputActuator is declared twice as follows:

EXTERNPROTO LightActuator [

eventIn SFBool activate

exposedField SFBool enabled TRUE

exposedField MFString url ["hw://lightDevice1"]

eventIn SFFloat intensity

eventIn SFColor color

] "org:mpeg:outputactuator"
EXTERNPROTO VibrationActuator [

eventIn SFBool activate

exposedField SFBool enabled TRUE

exposedField MFString url ["hw://vibrationDevice1"]

eventIn SFFloat intensity

] "org:mpeg:outputactuator"
This could be instantiated in the scene as follows:
DEF SI_LIGHT ScalarInterpolator {

key [0 0.5 1]

keyValue [0.1 0.2 0.3]

}
DEF SI_VIBRATION ScalarInterpolator {

key [0 0.5 1]

keyValue [0.1 0.2 0.3]

}
DEF LIGHT_1 LightActuator {}

DEF LIGHT_2 LightActuator {

url ["hw://lightDevice2"]

}
DEF VIBRATION_1 VibrationActuator {}

DEF VIBRATION_2 VibrationActuator {

enabled FALSE

url ["hw://vibrationDevice2"]

}
ROUTE SI_LIGHT.value_changed TO LIGHT_1.intensity

ROUTE SI_LIGHT.value_changed TO LIGHT_2.intensity
ROUTE SI_VIBRATION.value_changed TO VIBRATION_1.intensity

ROUTE SI_VIBRATION.value_changed TO VIBRATION_2.intensity

SphereSensor

XSD Description


Functionality and semantics

As specified in ISO/IEC 14772-1:1997, section 6.44.

The SphereSensor node maps pointing device motion into spherical rotation about the origin of the local coordinate system. The SphereSensor node uses the descendent geometry of its parent node to determine whether it is liable to generate events.

The enabled exposed field enables and disables the SphereSensor node. If enabled is TRUE, the sensor reacts appropriately to user events. If enabled is FALSE, the sensor does not track user input or send events. If enabled receives a FALSE event and isActive is TRUE, the sensor becomes disabled and deactivated, and outputs an isActive FALSE event. If enabled receives a TRUE event the sensor is enabled and ready for user activation.

The SphereSensor node generates events when the pointing device is activated while the pointer is indicating any descendent geometry nodes of the sensor's parent group. See ISO/IEC 14772-1:1997, section 4.6.7.5, Activating and manipulating sensors, for details on using the pointing device to activate the SphereSensor.

Upon activation of the pointing device (e.g., mouse button down) over the sensor's geometry, an isActive TRUE event is sent. The vector defined by the initial point of intersection on the SphereSensor's geometry and the local origin determines the radius of the sphere that is used to map subsequent pointing device motion while dragging. The virtual sphere defined by this radius and the local origin at the time of activation is used to interpret subsequent pointing device motion and is not affected by any changes to the sensor's coordinate system while the sensor is active. For each position of the bearing, a rotation_changed event is sent which corresponds to the sum of the relative rotation from the original intersection point plus the offset value. trackPoint_changed events reflect the unclamped drag position on the surface of this sphere. When the pointing device is deactivated and autoOffset is TRUE, offset is set to the last rotation_changed value and an offset_changed event is generated. See ISO/IEC 14772-1:1997, section 4.6.7.4, Drag sensors, for more details.

When the sensor generates an isActive TRUE event, it grabs all further motion events from the pointing device until it is released and generates an isActive FALSE event (other pointing-device sensors shall not generate events during this time). Motion of the pointing device while isActive is TRUE is termed a "drag". If a 2D pointing device is in use, isActive events will typically reflect the state of the primary button associated with the device (i.e., isActive is TRUE when the primary button is pressed and FALSE when it is released). If a 3D pointing device (e.g., wand) is in use, isActive events will typically reflect whether the pointer is within (or in contact with) the sensor's geometry.

While the pointing device is activated, trackPoint_changed and rotation_changed events are output. trackPoint_changed events represent the unclamped intersection points on the surface of the invisible sphere. If the pointing device is dragged off the sphere while activated, browsers may interpret this in a variety of ways (e.g., clamp all values to the sphere or continue to rotate as the point is dragged away from the sphere). Each movement of the pointing device while isActive is TRUE generates trackPoint_changed and rotation_changed events.
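The mapping from drag motion to rotation_changed described above can be sketched as follows: the relative rotation carries the initial intersection direction onto the current one. This is a simplified illustration (it computes only the relative axis-angle rotation; composing it with the offset value in general requires quaternion multiplication, which is omitted here).

```python
# A hedged sketch of the SphereSensor drag mapping: the relative rotation is
# the axis-angle rotation taking the initial intersection direction to the
# current bearing direction on the virtual sphere.
import math

def drag_rotation(p0, p1):
    """Axis-angle rotation carrying unit-sphere point p0 onto p1."""
    ax = (p0[1]*p1[2] - p0[2]*p1[1],
          p0[2]*p1[0] - p0[0]*p1[2],
          p0[0]*p1[1] - p0[1]*p1[0])              # cross product = rotation axis
    dot = sum(a*b for a, b in zip(p0, p1))
    angle = math.acos(max(-1.0, min(1.0, dot)))   # clamp for numeric safety
    return ax, angle

# Dragging from +X to +Y on the virtual sphere: a 90-degree turn about +Z.
axis, angle = drag_rotation((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
print(axis, round(angle, 4))  # (0.0, 0.0, 1.0) 1.5708
```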

TimeSensor

XSD Description


Functionality and semantics

As specified in ISO/IEC 14772-1:1997, section 6.50.

TimeSensor nodes generate events as time passes. TimeSensor nodes can be used for many purposes including:

driving continuous simulations and animations;

controlling periodic activities (e.g., one per minute);

initiating single occurrence events such as an alarm clock.

The TimeSensor node contains two discrete eventOuts: isActive and cycleTime. The isActive eventOut sends TRUE when the TimeSensor node begins running, and FALSE when it stops running. The cycleTime eventOut sends a time event at startTime and at the beginning of each new cycle (useful for synchronization with other time-based objects). The remaining eventOuts generate continuous events. The fraction_changed eventOut, an SFFloat in the closed interval [0,1], sends the completed fraction of the current cycle. The time eventOut sends the absolute time for a given simulation tick.

If the enabled exposedField is TRUE, the TimeSensor node is enabled and may be running. If a set_enabled FALSE event is received while the TimeSensor node is running, the sensor performs the following actions:


  • evaluates and sends all relevant outputs;

  • sends a FALSE value for isActive;

  • disables itself.

Events on the exposedFields of the TimeSensor node (e.g., set_startTime) are processed and their corresponding eventOuts (e.g., startTime_changed) are sent regardless of the state of the enabled field. The remaining discussion assumes enabled is TRUE.

The loop, startTime, and stopTime exposedFields and the isActive eventOut and their effects on the TimeSensor node are discussed in detail in 4.6.9, Time-dependent nodes. The "cycle" of a TimeSensor node lasts for cycleInterval seconds. The value of cycleInterval shall be greater than zero.

A cycleTime eventOut can be used for synchronization purposes such as sound with animation. The value of a cycleTime eventOut will be equal to the time at the beginning of the current cycle. A cycleTime eventOut is generated at the beginning of every cycle, including the cycle starting at startTime. The first cycleTime eventOut for a TimeSensor node can be used as an alarm (single pulse at a specified time).

When a TimeSensor node becomes active, it generates an isActive = TRUE event and begins generating time, fraction_changed, and cycleTime events which may be routed to other nodes to drive animation or simulated behaviours. The behaviour at read time is described below. The time event sends the absolute time for a given tick of the TimeSensor node (time fields and events represent the number of seconds since midnight GMT January 1, 1970).



fraction_changed events output a floating point value in the closed interval [0, 1]. At startTime the value of fraction_changed is 0. After startTime, the value of fraction_changed in any cycle will progress through the range (0.0, 1.0]. At startTime + N × cycleInterval, for N = 1, 2, ..., that is, at the end of every cycle, the value of fraction_changed is 1.

Let now represent the time at the current simulation tick. The time and fraction_changed eventOuts can then be computed as:



time = now

temp = (now - startTime) / cycleInterval

f = fractionalPart(temp)

if (f == 0.0 && now > startTime) fraction_changed = 1.0

else fraction_changed = f
where fractionalPart(x) is a function that returns the fractional part (that is, the digits to the right of the decimal point) of a nonnegative floating point number.
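The pseudocode above can be transcribed directly (assuming now, startTime and cycleInterval are expressed in consistent units of seconds):

```python
# Direct transcription of the time / fraction_changed pseudocode above.
import math

def time_sensor_outputs(now, start_time, cycle_interval):
    time_out = now
    temp = (now - start_time) / cycle_interval
    f = math.modf(temp)[0]                 # fractionalPart(temp)
    if f == 0.0 and now > start_time:      # end of a cycle: report 1.0, not 0.0
        fraction_changed = 1.0
    else:
        fraction_changed = f
    return time_out, fraction_changed

print(time_sensor_outputs(10.0, 0.0, 4.0))  # (10.0, 0.5)  - halfway through a cycle
print(time_sensor_outputs(8.0, 0.0, 4.0))   # (8.0, 1.0)   - exactly at a cycle boundary
```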

A TimeSensor node can be set up to be active at read time by specifying loop TRUE (not the default) and stopTime less than or equal to startTime (satisfied by the default values). The time events output absolute times for each tick of the TimeSensor node simulation. The time events shall start at the first simulation tick greater than or equal to startTime. time events end at stopTime, or at startTime + N × cycleInterval for some positive integer value of N, or loop forever depending on the values of the other fields. An active TimeSensor node shall stop at the first simulation tick when now >= stopTime > startTime.

No guarantees are made with respect to how often a TimeSensor node generates time events, but a TimeSensor node shall generate events at least at every simulation tick. TimeSensor nodes are guaranteed to generate final time and fraction_changed events. If loop is FALSE at the end of the Nth cycleInterval and was TRUE at startTime + M × cycleInterval for all 0 < M < N, the final time event will be generated with a value of (startTime + N × cycleInterval) or stopTime (if stopTime > startTime), whichever value is less. If loop is TRUE at the completion of every cycle, the final event is generated as evaluated at stopTime (if stopTime > startTime) or never.

An active TimeSensor node ignores set_cycleInterval and set_startTime events. An active TimeSensor node also ignores set_stopTime events for set_stopTime less than or equal to startTime. For example, if a set_startTime event is received while a TimeSensor node is active, that set_startTime event is ignored (the startTime field is not changed, and a startTime_changed eventOut is not generated). If an active TimeSensor node receives a set_stopTime event that is less than the current time, and greater than startTime, it behaves as if the stopTime requested is the current time and sends the final events based on the current time (note that stopTime is set as specified in the eventIn).

A TimeSensor read from a VRML file shall generate isActive TRUE, time and fraction_changed events if the sensor is enabled and all conditions for a TimeSensor to be active are met.

TouchSensor

XSD Description


Functionality and semantics

As specified in ISO/IEC 14772-1:1997, section 6.51.

A TouchSensor node tracks the location and state of the pointing device and detects when the user points at geometry contained by the TouchSensor node's parent group. A TouchSensor node can be enabled or disabled by sending it an enabled event with a value of TRUE or FALSE. If the TouchSensor node is disabled, it does not track user input or send events.

The TouchSensor generates events when the pointing device points toward any geometry nodes that are descendants of the TouchSensor's parent group. See ISO/IEC 14772-1:1997, section 4.6.7.5, Activating and manipulating sensors, for more details on using the pointing device to activate the TouchSensor.

The isOver eventOut reflects the state of the pointing device with regard to whether it is pointing towards the TouchSensor node's geometry or not. When the pointing device changes state from a position such that its bearing does not intersect any of the TouchSensor node's geometry to one in which it does intersect geometry, an isOver TRUE event is generated. When the pointing device moves from a position such that its bearing intersects geometry to one in which it no longer intersects the geometry, or some other geometry is obstructing the TouchSensor node's geometry, an isOver FALSE event is generated. These events are generated only when the pointing device has moved and changed 'over' state. Events are not generated if the geometry itself is animating and moving underneath the pointing device.
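The edge-triggered nature of isOver (an event fires only when the 'over' state changes as a result of pointer motion, not on every motion) can be sketched as follows. This is an illustrative model, not standard API.

```python
# Sketch of the edge-triggered isOver behaviour: an event fires only when the
# pointer's 'over' state changes between successive pointer motions.

def is_over_events(over_states):
    """Yield (motion_index, value) isOver events; the initial state is 'not over'."""
    events, prev = [], False
    for i, over in enumerate(over_states):
        if over != prev:                 # state changed -> generate an event
            events.append((i, over))
        prev = over
    return events

# Four pointer motions: off, on, still on (no event), off again.
print(is_over_events([False, True, True, False]))  # [(1, True), (3, False)]
```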

As the user moves the bearing over the TouchSensor node's geometry, the point of intersection (if any) between the bearing and the geometry is determined. Each movement of the pointing device, while isOver is TRUE, generates hitPoint_changed, hitNormal_changed and hitTexCoord_changed events. hitPoint_changed events contain the 3D point on the surface of the underlying geometry, given in the TouchSensor node's coordinate system. hitNormal_changed events contain the surface normal vector at the hitPoint. hitTexCoord_changed events contain the texture coordinates of that surface at the hitPoint. The values of hitTexCoord_changed and hitNormal_changed events are computed as appropriate for the associated shape.

If isOver is TRUE, the user may activate the pointing device to cause the TouchSensor node to generate isActive events (e.g., by pressing the primary mouse button). When the TouchSensor node generates an isActive TRUE event, it grabs all further motion events from the pointing device until it is released and generates an isActive FALSE event (other pointing-device sensors will not generate events during this time). Motion of the pointing device while isActive is TRUE is termed a "drag." If a 2D pointing device is in use, isActive events reflect the state of the primary button associated with the device (i.e., isActive is TRUE when the primary button is pressed and FALSE when it is released). If a 3D pointing device is in use, isActive events will typically reflect whether the pointing device is within (or in contact with) the TouchSensor node's geometry.

The eventOut field touchTime is generated when all three of the following conditions are true:


  • The pointing device was pointing towards the geometry when it was initially activated (isActive is TRUE).

  • The pointing device is currently pointing towards the geometry (isOver is TRUE).

  • The pointing device is deactivated (isActive FALSE event is also generated).

In a 2D context, there are restrictions on the SFVec3f eventOuts:

  • hitNormal_changed always returns [0.0, 0.0, 1.0];

  • hitPoint_changed always has 0.0 as its Z coordinate.

MediaSensor

XSD Description


Functionality and semantics

As defined in ISO/IEC 14496-11 (BIFS), section 7.2.2.71.2.

The MediaSensor node monitors the availability and presentation status of one or more stream objects.

The url field identifies a list of stream objects monitored by the MediaSensor node. All the stream objects in the url field shall belong to the same media stream. A stream object is considered to be available when any of its composition units is available in the composition buffer and is due for composition at that time. A stream object is considered to be no longer available when it is paused or stopped. A stream object is considered to “become available” when it “is available” for the first time. When several monitored stream objects are available at the same time, the fields in the MediaSensor convey information about the stream object that became available last. If the stream that last became available becomes inactive, the MediaSensor node shall convey information about the first active stream in its url field. The isActive event sends a TRUE value each time one of the monitored stream objects referred to by the url field becomes available, and a FALSE value when all of them become unavailable. Whenever a new composition unit is due for composition, a mediaCurrentTime event is sent, indicating the media time of that composition unit within the stream object.

The streamObjectStartTime event conveys the start of the stream object within a stream, relative to media time zero of the whole stream. The mediaDuration event conveys the duration of the stream object in seconds. It is set to –1 if this duration is unknown. The info event conveys information about the stream object that is currently monitored. Its first element identifies the stream object using the same syntax as in the url field.

The streamObjectStartTime, mediaDuration and info events are triggered when any stream object in the url field becomes available.

PlaneSensor

XSD Description

<complexType name="PlaneSensorType">
	<sequence>
		<element ref="xmta:IS" minOccurs="0"/>
	</sequence>
	<attribute name="autoOffset" type="xmta:SFBool" use="optional" default="true"/>
	<attribute name="enabled" type="xmta:SFBool" use="optional" default="true"/>
	<attribute name="maxPosition" type="xmta:SFVec2f" use="optional" default="-1 -1"/>
	<attribute name="minPosition" type="xmta:SFVec2f" use="optional" default="0 0"/>
	<attribute name="offset" type="xmta:SFVec3f" use="optional" default="0 0 0"/>
	<attributeGroup ref="xmta:DefUseGroup"/>
</complexType>
<element name="PlaneSensor" type="xmta:PlaneSensorType"/>

Functionality and semantics

As specified in ISO/IEC 14772-1:1997, section 6.34.

The PlaneSensor node maps pointing device motion into two-dimensional translation in a plane parallel to the Z=0 plane of the local coordinate system. The PlaneSensor node uses the descendent geometry of its parent node to determine whether it is liable to generate events.

The enabled exposedField enables and disables the PlaneSensor. If enabled is TRUE, the sensor reacts appropriately to user events. If enabled is FALSE, the sensor does not track user input or send events. If enabled receives a FALSE event and isActive is TRUE, the sensor becomes disabled and deactivated, and outputs an isActive FALSE event. If enabled receives a TRUE event, the sensor is enabled and made ready for user activation.

The PlaneSensor node generates events when the pointing device is activated while the pointer is indicating any descendent geometry nodes of the sensor's parent group. See ISO/IEC 14772-1:1997 Section 4.6.7.5, Activating and manipulating sensors, for details on using the pointing device to activate the PlaneSensor.

Upon activation of the pointing device (e.g., mouse button down) while indicating the sensor's geometry, an isActive TRUE event is sent. Pointer motion is mapped into relative translation in the tracking plane (a plane parallel to the sensor's local Z=0 plane and coincident with the initial point of intersection). For each subsequent movement of the bearing, a translation_changed event is output which corresponds to the sum of the relative translation from the original intersection point to the intersection point of the new bearing in the plane plus the offset value. The sign of the translation is defined by the Z=0 plane of the sensor's coordinate system. trackPoint_changed events reflect the unclamped drag position on the surface of this plane. When the pointing device is deactivated and autoOffset is TRUE, offset is set to the last translation_changed value and an offset_changed event is generated. More details are provided in ISO/IEC 14772-1:1997, section 4.6.7.4, Drag sensors.

When the sensor generates an isActive TRUE event, it grabs all further motion events from the pointing device until it is deactivated and generates an isActive FALSE event. Other pointing-device sensors shall not generate events during this time. Motion of the pointing device while isActive is TRUE is referred to as a "drag." If a 2D pointing device is in use, isActive events typically reflect the state of the primary button associated with the device (i.e., isActive is TRUE when the primary button is pressed, and is FALSE when it is released). If a 3D pointing device (e.g., wand) is in use, isActive events typically reflect whether the pointer is within or in contact with the sensor's geometry.



minPosition and maxPosition may be set to clamp translation_changed events to a range of values as measured from the origin of the Z=0 plane. If the X or Y component of minPosition is greater than the corresponding component of maxPosition, translation_changed events are not clamped in that dimension. If the X or Y component of minPosition is equal to the corresponding component of maxPosition, that component is constrained to the given value. This technique provides a way to implement a line sensor that maps dragging motion into a translation in one dimension.
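The three clamping cases above (min < max clamps, min > max leaves the dimension unclamped, min == max pins the component) can be sketched per component as follows:

```python
# Sketch of the minPosition/maxPosition clamping rule for translation_changed:
# per component, clamp when min < max, leave unclamped when min > max,
# and constrain to the given value when min == max.

def clamp_translation(t, min_pos, max_pos):
    out = []
    for v, lo, hi in zip(t, min_pos, max_pos):
        if lo > hi:                     # unclamped in this dimension
            out.append(v)
        elif lo == hi:                  # constrained to the given value
            out.append(lo)
        else:
            out.append(max(lo, min(hi, v)))
    return tuple(out)

# Defaults (minPosition 0 0, maxPosition -1 -1) leave both axes unclamped;
# a "line sensor" pins Y to 1.0 while clamping X to [0, 2].
print(clamp_translation((3.0, 5.0), (0.0, 0.0), (-1.0, -1.0)))  # (3.0, 5.0)
print(clamp_translation((3.0, 5.0), (0.0, 1.0), (2.0, 1.0)))    # (2.0, 1.0)
```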

While the pointing device is activated and moved, trackPoint_changed and translation_changed events are sent. trackPoint_changed events represent the unclamped intersection points on the surface of the tracking plane. If the pointing device is dragged off of the tracking plane while activated (e.g., above the horizon line), browsers may interpret this in a variety of ways (e.g., clamp all values to the horizon). Each movement of the pointing device, while isActive is TRUE, generates trackPoint_changed and translation_changed events.

Further information about this behaviour can be found in ISO/IEC 14772-1:1997, sections 4.6.7.3 (Pointing-device sensors), 4.6.7.4 (Drag sensors), and 4.6.7.5 (Activating and manipulating sensors).

