

How Industrial AI Assistant works

  • Last Updated: Sep 11, 2025

The architecture that supports Industrial AI Assistant is based on three key components:

  • CONNECT visualization

    Your data is securely hosted in CONNECT. This is the same secure repository used for all other data shown in CONNECT visualization.

    Visualization services also host the Industrial AI Assistant interface, where you make requests and ask questions.

  • Industrial AI Assistant

    The application that processes your requests and manages interactions between CONNECT visualization and the large language model.

  • Large Language Model (LLM)

    Supports the assistant by assessing each request and translating it into a series of search and data queries. The LLM also summarizes the results within the context of the request so that the most relevant information is returned to the user.
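
The way these components divide responsibilities can be pictured with a minimal Python sketch. All class and method names below are hypothetical, chosen only to illustrate the roles described above; they are not part of the product.

    from dataclasses import dataclass


    class ConnectVisualization:
        """Securely hosts your data and the assistant's user interface."""

        def query(self, queries: list[str]) -> list[str]:
            # Stub: a real deployment would run these against CONNECT data services.
            return [f"data for: {q}" for q in queries]


    class LargeLanguageModel:
        """Assesses each request and summarizes results in context."""

        def translate_request(self, request: str) -> list[str]:
            return [f"search query derived from: {request}"]

        def summarize(self, request: str, results: list[str]) -> str:
            return f"answer to '{request}' based on {len(results)} result(s)"


    @dataclass
    class IndustrialAIAssistant:
        """Manages interactions between CONNECT visualization and the LLM."""

        connect: ConnectVisualization
        llm: LargeLanguageModel

        def handle(self, request: str) -> str:
            queries = self.llm.translate_request(request)
            results = self.connect.query(queries)
            return self.llm.summarize(request, results)

In this sketch, IndustrialAIAssistant(ConnectVisualization(), LargeLanguageModel()).handle("...") returns a summarized answer, mirroring the flow of a request through the three components.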

When you enter a request into the assistant, the following process occurs.

  1. Industrial AI Assistant receives the request and prepares the required information for the LLM. This involves a combination of:

    1. Applying guardrail instructions that prevent unethical and irrelevant questions from being answered.

    2. Preparing a list of available tools and information that can be used to reply to the request. This is based on the solutions you have available in CONNECT visualization.

  2. The LLM then receives the information prepared by the assistant and performs two functions:

    1. Guardrail assessment. If a question is inappropriate, a response will be generated at this point to close the request.

    2. Intent analysis. The LLM looks at the language used in the request and determines the type of information that is required.

  3. Industrial AI Assistant applies the request to the information available in CONNECT data services and summarizes the relevant content.

  4. The LLM receives the summarized information and responds with an answer to the request.

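As a rough illustration of this four-step flow, the Python sketch below uses hypothetical function names and trivially simplified stubs; in the product, the guardrail assessment, intent analysis, and summarization are performed by the LLM and the assistant, not by simple string checks.

    GUARDRAIL_INSTRUCTIONS = "Decline unethical or irrelevant questions."
    AVAILABLE_TOOLS = ["search_documents", "query_production_data"]  # depends on your CONNECT solutions


    def passes_guardrails(request: str) -> bool:
        # Step 2a (stub): the LLM assesses the request against the guardrail instructions.
        return "off-topic" not in request.lower()


    def analyse_intent(request: str) -> dict:
        # Step 2b (stub): the LLM determines the type of information required.
        return {"tool": AVAILABLE_TOOLS[0], "query": request}


    def summarize_from_connect(intent: dict) -> str:
        # Step 3 (stub): the assistant applies the request to CONNECT data services
        # and summarizes the relevant content.
        return f"summary of {intent['tool']} results for '{intent['query']}'"


    def process_request(request: str) -> str:
        # Step 1: the assistant prepares guardrails and the available tools for the LLM.
        if not passes_guardrails(request):
            # Inappropriate questions are closed at this point.
            return "This request cannot be answered."
        intent = analyse_intent(request)
        summary = summarize_from_connect(intent)
        # Step 4: the LLM turns the summarized information into an answer.
        return f"Answer based on: {summary}"


    print(process_request("Show me yesterday's downtime events"))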

This architecture allows your data to be handled in the following ways:

  • User requests (the text you enter in the assistant) may be monitored to troubleshoot errors and to improve responses where user feedback is provided. The conversation history is used by the assistant to process follow-up questions. It is stored for up to 30 days so that you can return to a conversation, but beyond this point it is not retained.

  • Operational information (such as raw production data and documents) remains stored in CONNECT visualization. The assistant accesses it only while collating summarized data, and it is not retained by the assistant.

  • Summarized data (the data received by the LLM to process responses) is not retained.
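
For reference, this retention behaviour can be condensed into a small summary structure. The keys and values below are purely descriptive and assumed for illustration; this is not a setting you can configure in the product.

    DATA_HANDLING = {
        "user_requests": {
            "monitored_for_troubleshooting": True,   # where user feedback is provided
            "conversation_history_retention_days": 30,
        },
        "operational_information": {
            "stored_in": "CONNECT visualization",
            "retained_by_assistant": False,          # accessed only while collating summaries
        },
        "summarized_data": {
            "retained": False,
        },
    }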
