XR Glossary
- Last Updated: Jul 18, 2024
This glossary provides a list of terms and definitions that may appear in the AVEVA™ XR documentation.
Alpha channel
|
The alpha channel is an image channel in an additive color system that carries transparency information. An image has three standard color channels: red, green and blue (RGB). The alpha channel is the fourth channel (A), which carries and stores transparency information. An image without an alpha channel is always opaque. |
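As a sketch of how an alpha channel drives transparency, the standard "over" compositing operator blends a source color into a destination color according to the source alpha. The function name and sample values below are illustrative, not part of any AVEVA API:

```python
def over(src, dst, alpha):
    """Composite a source color over a destination color.

    src, dst: (r, g, b) tuples with components in [0.0, 1.0]
    alpha: source opacity in [0.0, 1.0] (1.0 = fully opaque)
    """
    return tuple(s * alpha + d * (1.0 - alpha) for s, d in zip(src, dst))

# A half-transparent red composited over a white background yields pink.
print(over((1.0, 0.0, 0.0), (1.0, 1.0, 1.0), 0.5))  # (1.0, 0.5, 0.5)
```

An image without an alpha channel behaves as if `alpha` were always 1.0, which is why it is always opaque.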
Ambient Occlusion (AO)
|
Ambient Occlusion is a shading and rendering effect that calculates how surfaces are exposed to ambient lighting. The effect is to emphasize differences between points where the geometry is more or less exposed to ambient light. |
Animated texture
|
An animated texture is a texture that moves through the dynamic shifting of the UV mapping values of the mesh, such as the surface of water. |
Anti-aliasing
|
Anti-aliasing is a post-processing technique in computer rendering that changes the color of some of the pixels along a sharp edge so that the edge looks less jagged, or softer. |
Application programming interface (API)
|
An application programming interface (API) is a computing interface that defines interactions between multiple software intermediaries. An API defines the calls or requests that can be made, how to make them, the data formats that should be used, the conventions to follow, and so on. It can also provide extension mechanisms so that users can extend existing functionality in various ways. |
Asset Performance Management (APM)
|
An Asset Performance Management (APM) system is a system that acts to improve the reliability and availability of physical assets, while minimizing risk and operating costs. APM improves integration between production management (making the product) and asset management (ensuring the capability to produce). |
Attach point
|
An attach point is an engine node that represents a position in the coordinate system where a manipulable item can be positioned. |
Augmented Reality (AR)
|
Augmented reality (AR) is an interactive experience of a real-world environment where objects in the real-world are enhanced by computer-generated perceptual information, such as visual and auditory information. |
Avatar
|
An avatar is an engine node responsible for character movements, camera handling, and the representation of a human character inside the virtual world. |
AVEVA™ Asset Information Management (AIM)
|
The AVEVA™ Asset Information Management (AIM) service is a solution for managing, accessing, visualizing, and comparing technical data and documents from multiple sources. AIM provides a centralized interface for full visualization of an entire digital asset collection associated with an industrial site—including schematics, P&IDs, object attributes, 3D models, and isometrics. Behind the scenes, the AIM service transforms this data from multiple information systems into trusted resources for use in other software projects, while delivering operational safety, asset information integrity, and reduced risk on data. |
AVEVA™ Dynamic Simulation
|
AVEVA™ Dynamic Simulation (formerly DYNSIM®) is a process simulator that combines first principles process models, and thermodynamic methods and data with an integrated graphical user interface (GUI) to produce highly accurate and rigorous dynamic simulation and high-fidelity control system emulation. |
AVEVA™ Licensing System (ALS)
|
AVEVA™ Licensing System (ALS) is a centralized license server for managing AVEVA products. |
AVEVA™ NET
|
AVEVA™ Net is the former name for AVEVA™ Asset Information Management (AIM). |
AVEVA™ Point Cloud Manager
|
AVEVA™ Point Cloud Manager (formerly AVEVA LFM) is AVEVA’s cloud-enabled 3D data capture solution for registering, processing, and visualizing point cloud and 3D model data on brownfield, greenfield, and maintenance projects. |
Axis
|
An axis is a direction or dimension that passes through the World center at its zero value. |
Beaufort scale
|
The Beaufort scale is an empirical measure that relates wind speed to observed conditions at sea or on land. Its full name is the Beaufort wind force scale. |
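The scale is often approximated by the empirical fit v = 0.836 · B^(3/2) (wind speed v in m/s, Beaufort number B). The sketch below inverts that fit; the formula is a commonly cited approximation, not an official definition, and the function name is illustrative:

```python
def beaufort(wind_speed_ms):
    """Approximate Beaufort number from wind speed in m/s.

    Inverts the empirical relation v = 0.836 * B**1.5, clamped to the 0-12 scale.
    """
    b = round((wind_speed_ms / 0.836) ** (2.0 / 3.0))
    return max(0, min(12, b))

print(beaufort(0.2))   # 0: calm
print(beaufort(10.0))  # 5: fresh breeze
print(beaufort(40.0))  # 12: hurricane force
```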
Beckmann roughness model
|
The Beckmann roughness model is based on an isotropic Gaussian distribution of slopes. Its roughness parameter corresponds to the standard deviation (RMS) of the slope; a value of zero corresponds to a perfectly smooth, specular surface. |
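The usual form of the Beckmann distribution is D(θ) = exp(−tan²θ / m²) / (π m² cos⁴θ), where m is the RMS slope. The sketch below evaluates that textbook form; it is not necessarily the exact formulation used by the engine:

```python
import math

def beckmann(cos_theta, m):
    """Beckmann microfacet distribution.

    cos_theta: cosine of the angle between the surface normal and the half vector
    m: roughness (RMS slope); smaller values mean a smoother surface
    """
    c2 = cos_theta * cos_theta
    tan2 = (1.0 - c2) / c2
    return math.exp(-tan2 / (m * m)) / (math.pi * m * m * c2 * c2)

# At normal incidence the distribution peaks at 1 / (pi * m^2).
print(round(beckmann(1.0, 0.5), 6))  # 1.27324
```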
Billboard
|
A billboard is a flat-surfaced object placed in the 3D world that changes its orientation depending on the camera position. Billboards are generally used to create impostors of 3D objects. For example, trees can be very complex to render, so the tree has one rendered side and always faces the camera. |
Binary Space Partitioning (BSP)
|
Binary Space Partitioning (BSP) is a method for recursively subdividing a space into two convex sets by using hyperplanes as partitions. This process of subdividing gives rise to a representation of objects within the space in the form of a tree data structure known as a BSP tree. It provides a way to divide space in order to have all 3D scene contents organized. |
Blender
|
Blender is a free and open-source 3D computer graphics software toolset used for creating animated films, visual effects, art, 3D printed models, motion graphics, interactive 3D applications, virtual reality, and computer games. |
Boolean operations
|
There are four main Boolean operations in 3D modeling:
|
Bounding box
|
A bounding box is the smallest bounding or enclosing box containing all vertices of a mesh. |
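Computing an axis-aligned bounding box is a simple min/max over the vertices; the sketch below uses plain Python tuples (the function name and sample mesh are illustrative):

```python
def bounding_box(vertices):
    """Return the axis-aligned bounding box of a mesh as (min_corner, max_corner)."""
    xs, ys, zs = zip(*vertices)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

verts = [(0, 0, 0), (2, 1, -1), (-1, 3, 4)]
print(bounding_box(verts))  # ((-1, 0, -1), (2, 3, 4))
```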
Bridge Manager (now XR Bridge)
|
XR Bridge, formerly called the Bridge Manager, handles the communication channels between the various components of the AVEVA XR system and external applications. |
Camera
|
The camera in the 3D world is an engine node representing a virtual camera. Like a real-world camera, the 3D camera frames the view of a scene by tracking, tumbling, panning, and zooming. Unlike a real-world camera, the 3D camera does not automatically capture lighting, motion blur, and other effects. These effects must be explicitly created and tuned for realistic output. |
Cascaded Shadow Map (CSM)
|
Cascaded Shadow Mapping (CSM) is a shading technique used to overcome perspective aliasing, one of the most common errors that occur with shadowing. |
Channel (bridge)
|
A communications channel used by XR Bridge. |
Channel (light)
|
Image files are divided into channels, which are grayscale single-channel images used to define the intensity of each primary color. For RGB light-emitting images, the channels are red, green, and blue. Some formats can hold additional channels, such as alpha (transparency). |
Clipmap
|
A clipmap is a technique for clipping a mipmap to the subset of data that is pertinent to the geometry being displayed, which is useful for loading as little data as possible when memory is limited. Specifically, a clipmap is an image (usually black and white) that defines what is drawn and what is clipped. In graphics, for example, a clipmap can be used to show a hole in a solid. |
Cmesh
|
A cmesh (collision mesh) is a mesh with collision properties. |
Collision
|
A collision occurs when two objects in the 3D world collide or impact each other. The collision properties determine what happens when objects collide. |
Configurator (now XR Settings)
|
The Configurator tool enables you to set up and customize all the parameters belonging to an AVEVA XR engine-based application. The AVEVA XR engine XML-based configuration format is supported by both the rendering and the logic layers of the engine. |
Connected Worker
|
A Connected Worker application is a digital application that is executed on a portable or wearable device designed to assist a worker in field activities by providing data access, documentation, and remote actions. In this way, the application 'enhances' the worker capabilities. |
Context
|
A context is a logic group of one or more MWX files. When a context is active, the content of all its MWX files is visible in the scene. However, when the context is not active, its MWX content is not visible. |
CSAA
|
Coverage Sampling Anti-Aliasing (CSAA). |
DDS file format
|
The DirectDraw Surface (DDS) container file format is a Microsoft format for storing data compressed with the proprietary S3 Texture Compression (S3TC) algorithm, which can be decompressed in hardware by graphical processing units (GPUs). It is a highly optimized file format for textures. This makes the format useful for storing graphical textures and cubic environment maps as a data file, both compressed and uncompressed. |
Depth of field (DOF)
|
Depth of Field (DOF) is a photographic term for the range of distances within which objects will be sharply focused. Objects that are outside of this range will appear blurred or out of focus. |
Digital content creation (DCC)
|
Digital content creation (DCC) tools are a category of tools used for creation of electronic media. |
DirectX
|
Microsoft DirectX is a collection of application programming interfaces (APIs) for handling tasks related to multimedia, especially game programming and video, on Microsoft platforms. Originally, the names of these APIs all began with "Direct", such as Direct3D, DirectDraw, DirectMusic, DirectPlay, DirectSound, and so forth. The name DirectX was coined as a shorthand term for all of these APIs (the X standing in for the particular API names) and soon became the name of the collection. |
Document Object Model
|
The Document Object Model (DOM) is a cross-platform and language-independent interface that treats an XML or HTML document as a tree structure wherein each node is an object representing a part of the document. Each branch of the tree ends in a node, and each node contains objects. |
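A short illustration of the DOM's tree-of-objects view, using Python's standard `xml.dom.minidom` module. The scene XML here is a made-up example, not an AVEVA file format:

```python
from xml.dom.minidom import parseString

doc = parseString("<scene><node id='cam1'/><node id='light1'/></scene>")
# Every part of the document is an object in a tree: the root element,
# its child elements, and their attributes.
root = doc.documentElement
ids = [n.getAttribute("id") for n in root.getElementsByTagName("node")]
print(ids)  # ['cam1', 'light1']
```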
DPI
|
DPI stands for Dispositivi di Protezione Individuale, which is Italian for Personal Protective Equipment. |
Dummy
|
A dummy is an object that represents a position, orientation, and scale in 3D space. It cannot represent a geometry, but only a transformation matrix. The name is taken from the 3DS Max dummy object. |
DXF file format
|
AutoCAD DXF is a CAD data file format developed by Autodesk. |
Dynamic actor
|
A Dynamic Actor is a PhysX object that is dynamic, not static. |
DYNSIM
|
AVEVA™ Dynamic Simulation (formerly DYNSIM®) is a process simulator that combines first principles process models, and thermodynamic methods and data with an integrated graphical user interface (GUI) to produce highly accurate and rigorous dynamic simulation and high-fidelity control system emulation. |
Easing
|
Easing is the process of making animation movement more natural with acceleration or deceleration. In classic animation, the term for motion that starts slowly and accelerates is "slow in," and for motion that starts quickly and decelerates is "slow out." The terminology most commonly used for these are “ease in” and “ease out." |
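One common easing curve is "smoothstep", which eases in and out of a motion; real animation systems offer many such curves, and this function name is illustrative:

```python
def ease_in_out(t):
    """Smoothstep easing: starts slowly, accelerates, then decelerates.

    t: normalized animation time in [0.0, 1.0]
    Returns the eased progress, also in [0.0, 1.0].
    """
    return t * t * (3.0 - 2.0 * t)

# Linear time 0.25 maps to slower eased progress near the start of the motion.
print(ease_in_out(0.25))  # 0.15625
```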
Engineering Station (now XR Studio)
|
Engineering Station is the former name for XR Studio, the central configuration tool for creating XR project files. |
Euler angles
|
In geometry, the Euler angles are three angles that describe the orientation of a body with respect to a fixed coordinate system. Euler angles can be defined by elemental geometry or by composition of rotations. The geometrical definition demonstrates that three composed elemental rotations (rotations about the axes of a coordinate system) are always sufficient to reach any target frame. |
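A sketch of composing three elemental rotations into one orientation matrix. This uses the common Z-Y-X (yaw-pitch-roll) convention as an assumption; many equally valid conventions exist, and the engine's may differ:

```python
import math

def euler_to_matrix(yaw, pitch, roll):
    """Build a 3x3 rotation matrix from Z-Y-X (yaw-pitch-roll) Euler angles in radians."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    # Rz(yaw) @ Ry(pitch) @ Rx(roll), composed by hand.
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def apply(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

# A 90-degree yaw turns the x-axis into the y-axis.
m = euler_to_matrix(math.pi / 2, 0.0, 0.0)
print([round(c, 6) for c in apply(m, (1.0, 0.0, 0.0))])  # [0.0, 1.0, 0.0]
```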
Event
|
When the properties of a node change, this generates an event. You can link the property of a (source) node to the property of another (target) node so that when the first changes and generates an event, the change propagates to the second node. This is called an event propagation chain. |
EYESIM
|
EYESIM is the commercial name of a product using the AVEVA XR engine and delivered with an OTS process simulator, as a representation of the virtual field. |
Face
|
A face is a surface made up of three or more edges, though edges can exist without a face being defined between them. All edges must flow in the same direction around a face for it to be properly defined, and the direction of this flow determines the direction of the face normal. |
Fast Approximate Anti-Aliasing (FXAA)
|
Fast Approximate Anti-Aliasing (FXAA). |
Fast Fourier transform (FFT)
|
A fast Fourier transform (FFT) is an algorithm that computes the discrete Fourier transform (DFT) of a sequence, or its inverse (IDFT). Fourier analysis converts a signal from its original domain (often time or space) to a representation in the frequency domain and vice versa. The DFT is obtained by decomposing a sequence of values into components of different frequencies. |
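A minimal sketch of the radix-2 Cooley-Tukey FFT, one of the classic FFT algorithms (the input length must be a power of two in this simple form):

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

# A constant signal has all of its energy in the zero-frequency bin.
print([round(abs(v), 6) for v in fft([1, 1, 1, 1])])  # [4.0, 0.0, 0.0, 0.0]
```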
FBX file format
|
FBX (Filmbox) is a proprietary file format (.fbx) by Autodesk that provides interoperability between digital content creation applications. |
Field of View (FoV)
|
Field of View (FoV) is the horizontal angle of a camera view. The wider the Field of View, the greater the extent of the observable world that is seen at any given moment. |
Field Operated Device (FOD)
|
A Field Operated Device (FOD) refers to a manual item in an industrial plant that can be operated only in the physical field. |
Field operator
|
A field operator is a person who physically goes into the field to interact with the industrial plant. |
File texture
|
A file texture is a bitmap image that can be mapped to shading attributes. |
First-bounce Global Illumination (FGI)
|
First-bounce Global Illumination is a variant of Global Illumination. |
Fractal
|
A fractal is a three-dimensional random function with a particular frequency distribution. Fractal textures are useful for simulating many natural phenomena, such as rock surfaces, clouds, or flames. |
Fresnel
|
Fresnel effects in computer rendering refer to the increasing reflectivity of a surface the more acutely you view it. Normal to a metal plate, for example, there may be very little reflection, but if you look at a glancing angle along its surface the reflected image is now sharp and brightly colored. For a transparent material, such as glass, the material is more transparent the closer to normal one views it so that the edges are mirror-like while the center is clear. |
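In real-time rendering, the Fresnel effect is often approximated with Schlick's formula, F = F₀ + (1 − F₀)(1 − cos θ)⁵, where F₀ is the reflectance at normal incidence. A minimal sketch, assuming this common approximation rather than any engine-specific formula:

```python
def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation of Fresnel reflectance.

    cos_theta: cosine of the angle between the view direction and the surface normal
    f0: reflectance at normal incidence (roughly 0.04 for glass)
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Looking straight on, glass reflects very little; at a glancing angle it is mirror-like.
print(fresnel_schlick(1.0, 0.04))  # 0.04
print(fresnel_schlick(0.0, 0.04))  # 1.0
```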
Frame rate
|
The frame rate is the speed at which frames are played, generally measured in frames per second; for example, a scene could be played back at 12, 24, 25, 30, or 60 frames per second. It is the measurement of the frequency (rate) at which an imaging device produces unique consecutive images, called frames. The term applies equally to computer graphics, video cameras, film cameras, and motion capture systems. Frame rate is most often expressed in frames per second (FPS) and, for progressive-scan monitors, in hertz (Hz). |
Frames per Second (FPS)
|
Frames per second (FPS) is the unit of measure of the frame rate. |
Frustum
|
A frustum is the volume of space that includes everything currently visible from a given camera viewpoint. It is defined by planes arranged in the shape of a four-sided cone whose dimensions are set by the camera near plane, far plane, and FoV. |
GCD file format
|
Graphics Cache Data (GCD) is a drawing file format created by Generic CADD. GCD files are used for holding drawings of building floor plans, layouts, and other architectural designs created with Generic CADD. |
Geometry
|
A geometry is a mathematical description of a 3D object. |
Global Illumination
|
Global illumination, or indirect illumination, is the name for a group of algorithms used in 3D computer graphics that add more realistic lighting to 3D scenes. Such algorithms take into account not only the light that comes directly from a light source (direct illumination), but also subsequent cases in which light rays from the same source are reflected by other surfaces in the scene, whether reflective or not (indirect illumination). |
Graphics Processing Unit (GPU)
|
A graphics processing unit (GPU) is a specialized electronic circuit that rapidly manipulates and alters memory to accelerate the creation of images in a frame buffer intended for output to a display device. |
Graphic Context (GC)
|
A graphic context consists of a set of 3D contents, provided by one or more MWX files, and the definition of all the nodes connected to that 3D content, such as items, hotspots, layers, sounds, and cameras. |
Heads Up Display (HUD)
|
In video gaming, the HUD (heads up display) or status bar is the method by which information is visually relayed to the player as part of a game's user interface. |
HoloLens
|
Microsoft HoloLens 2 is a pair of mixed reality smartglasses developed and manufactured by Microsoft. It is the successor to the pioneering Microsoft HoloLens. |
Horizon Based Ambient Occlusion (HBAO)
|
Horizon Based Ambient Occlusion (HBAO) is a variant of Ambient Occlusion (AO). It is a post processing image effect that adds realism to scenes by accentuating small surface details and reproducing light attenuation due to occlusion. |
Head-mounted Display (HMD)
|
A head-mounted display (HMD) is a display device, worn on the head or as part of a helmet that has a small display optic in front of one (monocular HMD) or each eye (binocular HMD). An HMD has many uses including gaming, aviation, engineering, and medicine. |
Heightmap
|
A heightmap is a 2D array or texture map holding height values. It is typically used for defining landscapes, or for displacement mapping. |
High Dynamic Range (HDR)
|
High dynamic range (HDR) rendering performs lighting calculations over a high dynamic range, which preserves details that would otherwise be lost to limited contrast ratios. Within the AVEVA XR framework, HDR refers to a rendering technique that adapts image brightness so that bright things can be really bright, dark things can be really dark, and details can be seen in both. |
HTC Vive
|
The HTC Vive is a virtual reality headset developed by HTC and Valve. The headset uses "room scale" tracking technology, allowing the user to move in 3D space and use motion-tracked handheld controllers to interact with the environment. |
Human-Machine Interface (HMI)
|
A Human-Machine Interface (HMI) is a user interface or dashboard that connects a person to a machine, system, or device. |
Hue, Saturation, and Value (HSV)
|
Hue, Saturation, and Value is a color mode that determines the shading and tint of a color. Hue corresponds to the pure color; saturation to the amount of white mixed with the hue; and value to the amount of black mixed with the hue. |
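Python's standard `colorsys` module converts between RGB and HSV (all components normalized to [0.0, 1.0]), which makes the definition easy to check:

```python
import colorsys

# Pure red: hue 0.0, fully saturated, full value.
print(colorsys.rgb_to_hsv(1.0, 0.0, 0.0))  # (0.0, 1.0, 1.0)
# Mixing in white halves the saturation but leaves hue and value alone.
print(colorsys.rgb_to_hsv(1.0, 0.5, 0.5))  # (0.0, 0.5, 1.0)
```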
Initial Condition (IC)
|
The initial condition (IC) represents the initial state of a system, such as an industrial plant. |
Instructor Station (now XR Instructor)
|
The Instructor Station is the former name for XR Instructor, an AVEVA XR application for training use. An instructor can use this software to run training courses and oversee a group of students who are learning in simulation sessions. |
Integrated Training Simulator (ITS)
|
An Integrated Training Simulator (ITS) is an extended version of the OTS (Operator Training Simulator). The ITS usually combines Process Simulator and Control Room with a simulated interactive Virtual Field that enables collaborative training scenarios involving field and control operators. |
Interpolation
|
In the context of computer animation, interpolation is inbetweening, or filling in frames between the keyframes. You have the choice to create interpolation, or not, between keyframes. |
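The simplest form of inbetweening is linear interpolation between two keyframe values; the function name below is illustrative, and real animation systems usually combine this with easing curves:

```python
def lerp(a, b, t):
    """Linearly interpolate between keyframe values a and b at normalized time t."""
    return a + (b - a) * t

# Inbetween frames for a value animated from 0 to 10 over five frames.
print([lerp(0.0, 10.0, t / 4) for t in range(5)])  # [0.0, 2.5, 5.0, 7.5, 10.0]
```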
Keyframe
|
A keyframe is a computer-generated position at a specific moment (frame) on a given trajectory. Important positions in the action defining the starting and ending points of any action correspond to keyframes. |
KTX file format
|
A Khronos Texture (KTX) is a format for storing textures for OpenGL and OpenGL ES applications. It is distinguished by the simplicity of the loader required to instantiate a GL texture object from the file contents. This makes KTX a texture storage format that is easy to load into GPU. |
LAAS
|
LAAS is the AVEVA Licensing System (ALS) delivered in the cloud. |
Launcher (now XR Launcher)
|
The Launcher tool enables you to create unique starting points for launching different AVEVA XR projects, or different configurations of the same AVEVA XR project. |
Level of Detail (LOD)
|
Level of Detail (LOD) is a system used to handle high definition models in a real-time environment. Models can have multiple instances with varying levels of detail that are replaced, based on distance to camera. |
Local coordinates
|
In 3D modeling, local coordinates refers to the XYZ coordinates that are specific to an object, and that change when that object is moved or rotated. |
Light probe
|
A Light probe or FGI probe is the probe used for FGI rendering pipeline lighting pre-calculations. |
LiteHMI (now XR 2D Interface)
|
LiteHMI is the former name for an AVEVA XR application that provides for fast development of simple, yet functional HMI interfaces. LiteHMI applications can natively communicate with other AVEVA XR engine real-time applications components through the message bus. |
Learning Management System (LMS)
|
A Learning Management System (LMS) is a software application for the delivery of educational courses, training programs, or learning programs. |
Maya
|
Maya is a 3D computer animation, modeling, simulation, and rendering software made by Autodesk. |
Mesh
|
A mesh is a generic term for a polygonal surface defined by a series of vertices, edges, and triangles, that can be open or closed. |
MipMaps
|
MipMaps (also MIP maps) are pre-calculated, optimized sequences of images, each of which is a progressively lower resolution representation of the same image. They are designed to increase rendering speed and reduce aliasing artifacts. |
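Each mipmap level halves the previous level's resolution. The sketch below lists such a chain, assuming the standard convention of halving each dimension down to 1×1 (the function name is illustrative):

```python
def mip_chain(width, height):
    """List the resolutions of a full mipmap chain, halving each level until 1x1."""
    levels = [(width, height)]
    while width > 1 or height > 1:
        width, height = max(width // 2, 1), max(height // 2, 1)
        levels.append((width, height))
    return levels

print(mip_chain(8, 4))  # [(8, 4), (4, 2), (2, 1), (1, 1)]
```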
Model
|
A model is a computer-based description and representation of a three-dimensional object. |
Model View Projection (MVP)
|
A Model View Projection (MVP) matrix is used to calculate the position on the view of a given position in the 3D World. |
Multi-Sampled Anti-Aliasing (MSAA)
|
Multi-Sampled Anti-Aliasing (MSAA) is a type of spatial anti-aliasing. It's a technique used in computer graphics to improve image quality. |
MWX file format
|
MWX is a proprietary AVEVA XR file format used to define objects that enable the XR engine to manage 3D elements in the scene. It supports models, materials, textures, and animations. The MWX is an XML file type that contains the definition of a set of objects that are part of a 3D scene together with their properties. The raw data of these scene objects is stored into binary files (such as .geom, .animk, .animg). These file types are proprietary file formats and considered as part of the MWX format. |
MWX soup
|
MWX Soup contains highly optimized, but unstructured geometry elements for use in AVEVA XR. An MWX file can contain two different types of content: (1) standard meshes with standard geometries, or (2) soup meshes. In some cases, these can both be mixed in the same file. |
Natural Features Tracking (NFT)
|
Natural feature tracking (NFT) is an object tracking method that utilizes natural interest points of an object to track, such as texture-patches, edges, color blobs. Physical objects can be represented by storing those interest points as feature descriptors in a feature map. |
Node
|
A node is a building block of the XR scripting language. The AVEVA XR engine has a great number of nodes of different kinds, all of which share the same declaration rules; examples include Logic nodes, Connection nodes, Command nodes, and Scene nodes. Each node is characterized by a certain number of parameters and attributes, and by rules that define its hierarchical positioning: some nodes must reside inside other nodes, while others can be placed almost anywhere. Each node field can contain an explicit value inside the node declaration tag; if a parameter is given a value in the declaration, the parameter assumes that value during script loading. These are called start values. |
Non-Playing Character (NPC)
|
A non-playing character (NPC) is any character in a game that is not controlled by a player. |
NVIDIA 3D Vision
|
NVIDIA 3D Vision is a stereoscopic gaming kit from NVIDIA, which consists of liquid crystal (LC) shutter glasses and driver software that enables stereoscopic vision for any Direct3D game. |
NVIDIA Multi-GPU
|
NVIDIA Multi-GPU configuration is a technology for simultaneously running multiple graphics cards to improve rendering performance in Direct3D and OpenGL applications. |
NVIDIA SLI
|
Scalable Link Interface (SLI) is a brand name for a multi-GPU technology developed by NVIDIA for linking two or more video cards together to produce a single output. SLI is a parallel processing algorithm for computer graphics that is designed to increase available processing power. |
NVIDIA Stereo
|
The NVIDIA 3D Vision Stereo driver supports OpenGL quad-buffered stereo. In OpenGL, active stereo is explicitly supported through quad-buffered stereo as part of the OpenGL API, which has the advantage of enabling stereo-in-a-window; the Direct3D APIs do not explicitly support quad-buffered stereo. |
Object Space
|
Object Space refers to the coordinate system in a 3D scene, derived from an object's local coordinates. |
OBJ file format
|
OBJ (or .OBJ) is a geometry definition file format first developed by Wavefront Technologies for its Advanced Visualizer animation package. The file format is open and has been adopted by other 3D graphics application vendors. The OBJ file format is a simple data format that represents 3D geometry. It includes the position of each vertex, the UV position of each texture coordinate vertex, vertex normals, and the faces that make up each polygon, defined as lists of vertices and texture vertices. |
Oculus Touch
|
Oculus Touch is the motion controller system used by Oculus VR in their Rift, Oculus Rift S, and Quest virtual reality systems. Two iterations of the controllers have been developed; the first for use in the original Oculus Rift, which uses external tracking, and the second one for use with the Rift S and the Oculus Quest, which use inside-out tracking. |
Oculus Rift
|
Oculus Rift is a lineup of virtual reality headsets developed and manufactured by Oculus VR, a division of Facebook Inc. |
OGG file format
|
An OGG file is a compressed audio file that uses free, unpatented OGG Vorbis audio compression. The Ogg format is a free, open container format that was created by Xiph.Org Foundation and is designed to provide efficient streaming and manipulation of high-quality digital multimedia. The Ogg container format can multiplex a number of independent streams for audio, video, text (such as subtitles), and metadata. |
On Screen Display (OSD)
|
An On Screen Display (OSD) is an image or text that is superimposed over a screen picture to display information. |
OpenGL
|
OpenGL (Open Graphics Library) is a cross-language, cross-platform application programming interface (API) for rendering 2D and 3D vector graphics. The API is typically used to interact with a graphics processing unit (GPU) to achieve hardware-accelerated rendering. |
Open VR
|
OpenVR is a software development kit (SDK) and application programming interface developed by Valve for supporting the SteamVR (HTC Vive) and other virtual reality headset (VR) devices. The SteamVR platform uses it as the default application programming interface (API) and runtime. It serves as the interface between the VR hardware and software and is implemented by SteamVR. |
Operator Training Simulator (OTS)
|
The Operator Training Simulator (OTS) is a system made of a process simulator (DYNSIM) and an emulated/simulated Control Room (FSIM/InTouch). The EYESIM product can be coupled to an Operator Training Simulator to extend it by providing a simulated interactive virtual field. |
Oriented Bounding Box
|
An oriented bounding box (OBB) is a type of bounding box. In the case where an object has its own local coordinate system, it can be useful to store a bounding box relative to these axes. |
Overlay
|
An overlay is part of the scene environment, such as a chair or a plant, that is placed in front of the main animation. |
PAK file
|
A PAK file contains an archive of application data. |
Particle
|
A particle is a point displayed as a dot, streak, sphere, or other effect. You can animate the display and movement of particles with various techniques. Typically, particles are used in large quantities to create effects like rain and explosions. |
Particle System
|
A particle system is a means of distributing particles in space and time. Particle systems are used to simulate sparks, explosions, fire, and fluids. |
Personal Protective Equipment (PPE)
|
Personal protective equipment (PPE) is protective clothing, helmets, goggles, or other garments and equipment that is designed to protect the wearer's body from injury or infection. The hazards addressed by protective equipment include physical, electrical, heat, chemicals, biohazards, and airborne particulate matter. Protective equipment is worn for job-related occupational safety and health purposes. |
Pipeline
|
A pipeline in computer graphics (also called a rendering pipeline or graphics pipeline) is a conceptual model that describes what steps a graphics system needs to perform to render a 3D scene to a 2D screen. |
PlantView (now XR TabletViewer)
|
PlantView is the former name for XR TabletViewer, an AVEVA XR engine runtime component that implements field-related applications. XR TabletViewer can communicate over the message bus with other AVEVA XR runtime nodes to display real-time data and enable real-time interaction with systems. XR TabletViewer is designed to be used on a mobile device running Windows. |
Player (now XR Viewer)
|
Player is the former name for XR Viewer, an AVEVA XR application that runs the 3D simulation. The XR Viewer Runtime Debug Tool enables developers to observe and identify any problems in the visualization. |
Polygon
|
A polygon is an industry standard for constructing geometry. Polygons are always internally considered as a sum of triangles and often referred to as faces of a mesh. |
Point of View (POV)
|
Point of View (POV) refers to a graphical perspective rendered from the viewpoint of the player character. |
Pose estimate
|
Pose estimation refers to computer vision techniques that estimate an object's position based on tracking of features on the visual input. Pose estimation makes Augmented Reality (AR) possible by assigning a position in the 3D World to a given object based on the camera input and the tracking of the real World. |
Point cloud
|
A point cloud is a set of data points in space. |
Quadtree
|
A quadtree is a tree data structure in which each internal node has exactly four children. Quadtrees are the two-dimensional analog of octrees and are most often used to partition a two-dimensional space by recursively subdividing it into four quadrants or regions. |
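A minimal sketch of a point quadtree over a square region, splitting a node into four quadrants once it exceeds a small capacity. The class and parameter names are illustrative, not an engine API:

```python
class Quadtree:
    """Minimal point quadtree over an axis-aligned square region."""

    def __init__(self, x, y, size, capacity=4):
        self.x, self.y, self.size = x, y, size  # lower-left corner and side length
        self.capacity = capacity
        self.points = []
        self.children = None  # the four sub-quadrants, once split

    def insert(self, px, py):
        if not (self.x <= px < self.x + self.size and self.y <= py < self.y + self.size):
            return False  # point lies outside this node
        if self.children is None:
            if len(self.points) < self.capacity:
                self.points.append((px, py))
                return True
            self._split()
        return any(child.insert(px, py) for child in self.children)

    def _split(self):
        half = self.size / 2
        self.children = [
            Quadtree(self.x, self.y, half, self.capacity),
            Quadtree(self.x + half, self.y, half, self.capacity),
            Quadtree(self.x, self.y + half, half, self.capacity),
            Quadtree(self.x + half, self.y + half, half, self.capacity),
        ]
        for p in self.points:  # push the existing points down into the quadrants
            any(child.insert(*p) for child in self.children)
        self.points = []

tree = Quadtree(0.0, 0.0, 1.0, capacity=2)
for p in [(0.1, 0.1), (0.6, 0.2), (0.3, 0.7)]:
    tree.insert(*p)
print(tree.children is not None)  # True: the third point forced a split
```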
Quaternion
|
In mathematics, the quaternions are a number system that extends the complex numbers. Quaternions are used in pure mathematics, and also have practical uses in applied mathematics—in particular for calculations involving three-dimensional rotations such as in three-dimensional computer graphics. |
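A sketch of the two quaternion operations graphics code actually uses: the Hamilton product, and rotating a vector via q·v·q*. The (w, x, y, z) component order is an assumption here; libraries differ:

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    )

def rotate(q, v):
    """Rotate 3D vector v by unit quaternion q via q * v * q_conjugate."""
    qc = (q[0], -q[1], -q[2], -q[3])
    w = quat_mul(quat_mul(q, (0.0, *v)), qc)
    return w[1:]

# A 90-degree rotation about the z-axis maps the x-axis onto the y-axis.
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
print([round(c, 6) for c in rotate(q, (1.0, 0.0, 0.0))])  # [0.0, 1.0, 0.0]
```

Unlike Euler angles, this representation has no gimbal lock, which is why engines favor it for composing 3D rotations.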
Radiosity
|
Radiosity is a global illumination algorithm where the illumination arriving on a surface comes not just directly from the light sources, but also from other surfaces reflecting light. |
Reflection
|
Reflection in computer graphics is used to emulate reflective objects like mirrors and shiny surfaces. Reflection on a shiny surface like wood or tile can add to the photo-realistic effects of a 3D rendering. |
Refraction
|
Refraction is the change in direction of a wave due to a change in its velocity, most commonly seen when a wave passes from one medium to another. Refraction of light is the most familiar example, but any type of wave can refract when it interacts with a medium, for example when sound waves pass from one medium into another. |
REST
|
REST (Representational state transfer) is a software architectural style that defines a set of constraints for creating Web services. Web services that conform to the REST architectural style provide interoperability between computer systems. |
RFC file format
|
An RFC file contains game content. It stores information about content that appears during gameplay. RFC files include information about the appearance and behavior of the content. |
Rotation Gizmo
|
When in Rotate mode in the 3D View of the Graphic Context Editor, the Transformation Gizmo becomes the Rotation Gizmo. This gizmo is for rotating nodes to change the orientation values. |
RMS
|
In mathematics, the root mean square (RMS) is defined as the square root of the mean square (the arithmetic mean of the squares of a set of numbers). The RMS is also known as the quadratic mean and is a particular case of the generalized mean with exponent 2. |
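The definition above translates directly into code; a plain-Python sketch (the function name `rms` is illustrative):

```python
# RMS as defined above: the square root of the arithmetic mean of the squares.
import math

def rms(values):
    return math.sqrt(sum(v * v for v in values) / len(values))

print(rms([1, 2, 3, 4]))  # sqrt((1 + 4 + 9 + 16) / 4) = sqrt(7.5) ~ 2.7386
```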
RVM file format
|
RVM data files are associated with AVEVA PDMS. An RVM file is an AVEVA Plant Design Management System model. The AVEVA Plant Design Management System (PDMS) is a 3D design system that uses data-centric technology for managing projects. |
Scale Gizmo
|
When in Scale mode in the 3D View of the Graphic Context Editor, the Transformation Gizmo becomes the Scale Gizmo. This gizmo is for scaling nodes to change the size values. |
Scene
|
A scene is a file containing all the information necessary to identify and position all of the models, lights, and cameras for rendering. A scene can be identified with the 3D coordinate space in which rendering takes place. This space is often called the global coordinate space, as opposed to the local coordinate spaces associated with each individual object in the scene. |
SFX
|
Sound Special Effect (SFX) |
Shader
|
A shader combines the material, texture, lighting, and shadow information at a pixel level in the rendered image to create a final result that may appear shiny, matte, cartoon shaded, photo-realistic, or translucent. Shaders have a major impact on how objects look after rendering. |
Shadow map
|
A shadow map is a texture buffer of depth values rendered in a separate render pass from the perspective of a light source. During the main pass, it is projected onto other geometry to determine which areas are in shadow. |
SLAM (Simultaneous Localization and Mapping)
|
SLAM (Simultaneous Localization and Mapping) is a technology that understands the physical world through feature points. This makes it possible for AR applications to recognize 3D objects and scenes, as well as to instantly track the world, and to overlay digital interactive augmentations. |
Slerp
|
In computer graphics, Slerp is shorthand for spherical linear interpolation in the context of quaternion interpolation for the purpose of animating 3D rotation. It refers to constant-speed motion along a unit-radius great circle arc, given the ends and an interpolation parameter between 0 and 1. |
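A sketch of the constant-speed interpolation described above, in plain Python. The shorter-arc flip and the near-parallel lerp fallback are common practical refinements, not part of the definition itself, and the function name is illustrative.

```python
# Spherical linear interpolation between two unit quaternions (w, x, y, z).
import math

def slerp(q0, q1, t):
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                       # flip one end to take the shorter arc
        q1, dot = [-c for c in q1], -dot
    if dot > 0.9995:                    # nearly parallel: fall back to lerp
        out = [a + t * (b - a) for a, b in zip(q0, q1)]
        n = math.sqrt(sum(c * c for c in out))
        return [c / n for c in out]
    theta = math.acos(dot)              # angle between the two quaternions
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(q0, q1)]

# Halfway between identity and a 90-degree Z rotation is a 45-degree Z rotation.
q_id = [1.0, 0.0, 0.0, 0.0]
q_z90 = [math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4)]
q_half = slerp(q_id, q_z90, 0.5)
print([round(c, 4) for c in q_half])  # [0.9239, 0.0, 0.0, 0.3827]
```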
Solid Pointcloud™ (SPC)
|
Solid Pointcloud™ (SPC) is a way of rendering point cloud data into visually solid surfaces with low hardware requirements. This enables the user to understand an entire site on-screen at once, then dive down deeper into an area of interest. This concept is part of AVEVA™ Point Cloud. |
Soup (MWX soup)
|
MWX Soup contains highly optimized, but unstructured geometry elements for use in AVEVA XR. An MWX file can contain two different types of content: (1) standard meshes with standard geometries, or (2) soup meshes. In some cases, these can both be mixed in the same file. |
Specular
|
Specular describes the shiny quality of a surface as shown by highlights on an object. |
Static Actor
|
A Static Actor is a PhysX object that is fixed in the world: it participates in collisions but does not move during simulation. |
Storyboard
|
A storyboard shows a sequence of events, such as activity flow. |
Subpixel Morphological Antialiasing (SMAA)
|
Subpixel Morphological Antialiasing (SMAA) is an image-based, GPU-accelerated implementation of Morphological Antialiasing. It is a technique for minimizing the distortion artifacts, known as aliasing, that occur when representing a high-resolution image at a lower resolution. |
Texture
|
In 3D graphics, a texture is the digital representation of the surface of an object. In addition to two-dimensional qualities, such as color and brightness, a texture is also encoded with three-dimensional properties, such as how transparent and reflective the object is. Once a texture has been defined, it can be wrapped around any 3-dimensional object. This process is called texture mapping. |
Texture map
|
A texture map is a bitmap image or rendering resource used in texture mapping, applied to 3D models and indexed by UV mapping for 3D rendering. |
TGA file format
|
The TGA format is a raster graphic format for bitmap images, designed by Truevision. It can represent black-and-white, indexed-color, and RGB-color bitmaps, and supports various compression methods. TGA files support 8, 16, 24, or 32 bits per pixel, with a maximum of 24 bits for RGB color plus an 8-bit alpha channel. TGA files are used for various types of images, such as digital photos and textures referenced by 3D video games, and are common in the animation and video industries. |
Tone mapping
|
Tone mapping is a technique used in image processing and computer graphics to map one set of colors to another to approximate the appearance of high-dynamic-range images in a medium that has a more limited dynamic range. |
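As one concrete example of such a mapping, the classic Reinhard operator L/(1+L) compresses unbounded HDR luminance into [0, 1). This is a generic illustration, not the operator used by any specific AVEVA XR component.

```python
# Global tone mapping with the Reinhard operator: L_out = L / (1 + L).
# Any HDR luminance value, however large, maps into the range [0, 1).

def reinhard(luminance):
    return luminance / (1.0 + luminance)

hdr = [0.05, 0.5, 1.0, 4.0, 100.0]          # HDR luminance samples
ldr = [round(reinhard(v), 3) for v in hdr]  # all compressed into [0, 1)
print(ldr)
```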
TPM
|
Training Performance Monitoring (TPM) is a performance evaluation scoring system, usually from an external system. |
Transformation Gizmo
|
The Transformation Gizmo in the 3D View of the Graphic Context Editor is used to edit space transformations of objects in the scene. Depending on what mode is active, the gizmo becomes the Translation Gizmo (Move mode), the Rotation Gizmo (Rotate mode), or the Scale Gizmo (Scale mode). The generic name for these three types of gizmos together is Transformation Gizmo. |
Transformation matrix
|
A transformation matrix is a specific application of matrices that enables transformation from one space to another. When matrices are used this way, they are called transformation matrices. The transformation matrix facilitates a mapping between spaces. |
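A small Python sketch of the mapping described above, applying a 4×4 homogeneous transformation matrix (rotation/scale in the upper-left 3×3, translation in the last column) to a point; the function name is illustrative:

```python
# Apply a 4x4 homogeneous transformation matrix to a 3D point.

def transform(m, p):
    x, y, z = p
    v = (x, y, z, 1.0)                               # homogeneous point
    return tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(3))

# Translate by (10, 0, 0) and scale Y by 2.
M = [[1, 0, 0, 10],
     [0, 2, 0,  0],
     [0, 0, 1,  0],
     [0, 0, 0,  1]]
print(transform(M, (1.0, 3.0, 5.0)))  # (11.0, 6.0, 5.0)
```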
Translation Gizmo
|
When in Move mode in the 3D View of the Graphic Context Editor, the Transformation Gizmo becomes the Translation Gizmo. This gizmo is for moving nodes to change the position values. |
User Datagram Protocol (UDP)
|
User Datagram Protocol (UDP) is an alternative communications protocol to Transmission Control Protocol (TCP) used primarily for establishing low-latency and loss-tolerating connections between applications on the internet. |
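A minimal illustration of UDP's connectionless datagrams, using Python's standard socket module over the loopback interface (loopback delivery is reliable in practice, though UDP itself makes no such guarantee):

```python
# Send and receive one UDP datagram on localhost; note there is no
# connection handshake, unlike TCP.
import socket

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))              # port 0 => OS picks a free port
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello", addr)                # fire-and-forget datagram

data, _ = receiver.recvfrom(1024)
print(data)                                  # b'hello'
sender.close()
receiver.close()
```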
Unit of Measure (UOM)
|
Unit of Measure (UOM) is a definite magnitude of a quantity, defined and adopted by convention or by law, that is used as a standard for measurement of the same kind of quantity. Any other quantity of that kind can be expressed as a multiple of the unit of measurement. |
UTF-8
|
UTF-8 is a variable-width character encoding for electronic communication. Defined by the Unicode Standard, the name is derived from Unicode (or Universal Coded Character Set) Transformation Format – 8-bit. |
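The variable width is easy to observe: in the Python sketch below, an ASCII letter encodes to 1 byte while other characters take 2 to 4 bytes.

```python
# UTF-8's variable width in practice: "A" is 1 byte, "é" is 2,
# "€" is 3, and the musical G clef (U+1D11E) is 4.
samples = ["A", "é", "€", "𝄞"]
for ch in samples:
    encoded = ch.encode("utf-8")
    print(ch, len(encoded), encoded.hex())
```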
UV mapping
|
UV mapping is the 3D modeling process of projecting a 2D image to a 3D model's surface for texture mapping. The letters U and V denote the axes of the 2D texture because X, Y and Z are already used to denote the axes of the 3D object in model space. |
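A toy nearest-neighbour texture lookup shows how U and V in [0, 1] index into a texture; the `sample` function and the tiny 2×2 texture are illustrative assumptions, with no filtering or wrapping modes.

```python
# Nearest-neighbour sampling of a texture at a UV coordinate: U and V in
# [0, 1] are scaled to texel indices.

def sample(texture, u, v):
    h = len(texture)
    w = len(texture[0])
    x = min(int(u * w), w - 1)   # clamp so u = 1.0 stays in range
    y = min(int(v * h), h - 1)
    return texture[y][x]

tex = [["r0", "r1"],
       ["g0", "g1"]]             # a tiny 2x2 "texture"
print(sample(tex, 0.0, 0.0))     # r0 (first texel)
print(sample(tex, 0.99, 0.99))   # g1 (last texel)
```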
Velocity
|
In animation, the velocity (also known as ease) is the acceleration or deceleration of a motion. This can be achieved by a function curve, or via a series of animated drawings. Other common terms for ease-in and ease-out are slow-in and slow-out. |
Vertex
|
A vertex is a point in 3D space defined by XYZ coordinates. Vertices differ from simple points by having a normal direction (which is perpendicular to the face they are part of) and a color. Vertices exist independently of edges. |
Viewport
|
In computer graphics, a viewport is a viewing region or window of the graphical world. Sometimes, the viewport is larger than the available screen. In such cases, the user needs to scroll to see the entire viewport. |
Virtual Reality (VR)
|
Virtual reality (VR) is an interactive computer-generated experience taking place within a simulated environment. |
World axis
|
The world axis is the XYZ axis at the center of World space. |
World coordinates
|
World coordinates refers to the unique coordinate space inside a scene. The universal coordinate system enables individual models to interact with each other. Also called global coordinates. |
World space
|
World space is the space inside a 3D scene that is defined by World coordinates. The universal coordinate system enables individual models to interact with each other. |
World units
|
World units are the units of measure used in World Space. |
WRL file format
|
WRL is a file extension for a Virtual Reality Modeling Language (VRML) file format used by browsers to display virtual reality environments. |
XR 2D Interface
|
XR 2D Interface (formerly LiteHMI) is a lightweight version of HMI. The application displays one HMI page at a time. |
XR Bridge
|
XR Bridge (formerly the Bridge Manager) handles communications channels between various components of the AVEVA XR system and/or external applications. |
XR Instructor
|
XR Instructor (formerly Instructor Station) is an AVEVA XR application for training use. An instructor can use this software to run training courses and oversee a group of students who are learning in simulation sessions. |
XR Launcher
|
XR Launcher is a tool that creates unique starting points for launching different projects or different configurations of the same project. |
XR Login
|
XR Login is the part of the Session Management System that serves as an entry point for training or assessment activities. |
XR Settings
|
XR Settings (formerly the Configurator tool) enables you to set up and customize all the parameters belonging to an AVEVA XR engine-based application. The AVEVA XR engine XML-based configuration format is supported by both the rendering and the logic layers of the engine. |
XR Studio
|
XR Studio (formerly the Engineering Station) is the central configuration tool for creating XR project files. |
XR TabletViewer
|
XR TabletViewer is an AVEVA XR engine runtime component that implements field-related applications. XR TabletViewer can communicate over the message bus with other AVEVA XR runtime nodes to display real-time data and enable real-time interaction with systems. XR TabletViewer is designed to be used on a mobile device running Windows. |
XR Viewer
|
XR Viewer (formerly Player) is the AVEVA XR application that runs the 3D simulation. An XR Viewer Runtime Debug Tool is available for developers to observe and identify any problems in the visualization. |
Z buffer
|
The Z buffer refers to the local Z axis of a camera, and is a channel of information in an image format that stores the distance of each rendered pixel from the camera plane. Z channel information is most often used to control Depth of Field (DOF) (focusing) on particular elements in the scene. It can also be used to define mist. |
Z-fighting
|
Z-fighting, also called stitching, is a phenomenon in 3D rendering that occurs when two or more primitives have similar or identical values in the z-buffer. |
ZGL format
|
ZGL is a compressed version of the XGL format. The XGL file format is designed to represent 3D information for the purpose of visualization. It attempts to capture all of the 3D information that can be rendered by SGI's OpenGL rendering library. |


