Commit cd48339

AI glossary first submission
1 parent ae59c8d commit cd48339

File tree

1 file changed: 33 additions, 0 deletions
  • supplementary_style_guide/glossary_terms_conventions/general_conventions

supplementary_style_guide/glossary_terms_conventions/general_conventions/i.adoc

Lines changed: 33 additions & 0 deletions
@@ -348,6 +348,17 @@ There is no functional difference between the first server that was installed an
*See also*: xref:bucket-index[bucket index]
[[inference]]
==== image:images/yes.png[yes] inference (noun)
*Description*: _Inference_ is the process by which a model processes input data, draws deductions from it, and produces outputs.
*Use it*: yes
[.vale-ignore]
*Incorrect forms*:
*See also*:
[[inference-engine]]
==== image:images/yes.png[yes] inference engine (noun)
*Description*: In Red{nbsp}Hat Process Automation Manager and Red{nbsp}Hat Decision Manager, the _inference engine_ is a part of the Red{nbsp}Hat Decision Manager engine, which matches production facts and data to rules. It is often called the brain of a production rules system because it is able to scale to a large number of rules and facts. It makes inferences based on its existing knowledge and performs the actions based on what it infers from the information.
@@ -359,6 +370,28 @@ There is no functional difference between the first server that was installed an
*See also*:
[[inferenceservice]]
==== image:images/yes.png[yes] InferenceService (noun)
*Description*: In Red{nbsp}Hat OpenShift AI, this is the custom resource definition (CRD) that is used to create the `InferenceService` object. Always write it as the CRD name, `InferenceService`, in monospace.
*Use it*: yes
[.vale-ignore]
*Incorrect forms*: Inference Service, inference service
*See also*:
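
For example, a minimal `InferenceService` custom resource might look like the following. This is a hypothetical sketch: the object name, model format, and storage location are placeholders, and the `serving.kserve.io/v1beta1` API version assumes a KServe-based model serving stack.

[source,yaml]
----
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: example-model            # placeholder object name
spec:
  predictor:
    model:
      modelFormat:
        name: onnx               # placeholder model format
      storageUri: s3://example-bucket/models/example-model  # placeholder location
----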
[[inference-serving]]
==== image:images/yes.png[yes] inference serving (verb)
*Description*: _Inference serving_ is the process of deploying a model onto a server. Write it as two separate words, for example, "The following charts display the minimum hardware requirements for inference serving a model."
*Use it*: yes
[.vale-ignore]
*Incorrect forms*:
*See also*:
[[infiniband]]
==== image:images/yes.png[yes] InfiniBand (noun)
*Description*: _InfiniBand_ is a switched fabric network topology used in high-performance computing. The term is both a service mark and a trademark of the InfiniBand Trade Association. Their rules for using the mark are standard ones: append the (TM) symbol the first time it is used, and respect the capitalization (including the inter-capped "B") from then on. In ASCII-only circumstances, the "\(TM)" string is the acceptable alternative.
