Generative AI
Definition
A category of artificial intelligence systems capable of creating new content — including text, images, code, music, and video — based on patterns learned from training data. Generative AI is transforming content production, product design, and software development, raising novel questions about intellectual property ownership and the valuation of AI-generated outputs.
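To make the "patterns learned from training data" idea concrete, the sketch below trains a toy first-order Markov chain on one sentence and samples new text from it. This illustrates only the learn-then-generate loop; production generative models use large neural networks, and the corpus, seed word, and output length here are arbitrary choices.

```python
import random
from collections import defaultdict

# Toy illustration of the core idea: learn patterns from training data,
# then sample new content from those patterns.
corpus = "the model learns patterns from data and the model generates new text from patterns"

# Learn: record which word tends to follow which (a first-order Markov chain).
transitions = defaultdict(list)
words = corpus.split()
for current, nxt in zip(words, words[1:]):
    transitions[current].append(nxt)

# Generate: start from a seed word and repeatedly sample a plausible next word.
random.seed(0)
word = "the"
output = [word]
for _ in range(8):
    candidates = transitions.get(word)
    if not candidates:
        break
    word = random.choice(candidates)
    output.append(word)

print(" ".join(output))
```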
Complementary Terms
Concepts that frequently appear alongside Generative AI in practice.
Large Language Model (LLM)
A type of neural network trained on vast corpora of text data, capable of generating human-like text, answering questions, summarising documents, and performing reasoning tasks. Large language models such as GPT and Claude represent significant R&D investment and are reshaping knowledge work, customer service, and content production across industries.
Computer Vision
A field of artificial intelligence that enables machines to interpret and extract information from visual inputs such as images, video, and documents. Computer vision is applied in quality inspection, medical imaging, autonomous vehicles, and document processing.
Knowledge Economy
An economic system in which growth and value creation are driven primarily by the production, distribution, and application of knowledge and information rather than physical goods. In the knowledge economy, intangible assets — including human capital, software, data, and intellectual property — constitute the majority of enterprise and national wealth.
Explainable AI
Artificial intelligence systems designed to provide human-interpretable explanations of their decision-making processes and outputs. Explainability is increasingly required by regulators — particularly in financial services, healthcare, and criminal justice — and is a key differentiator for AI products seeking enterprise adoption in regulated industries.
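One common model-agnostic way to produce such explanations is permutation importance: shuffle one input feature at a time and measure how much the model's accuracy drops. The sketch below applies it to a synthetic dataset; the feature names and the logistic-regression model are illustrative stand-ins, not a prescribed approach.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic tabular data standing in for, e.g., a credit-decision dataset.
X, y = make_classification(n_samples=1000, n_features=4, n_informative=2,
                           n_redundant=0, random_state=0)
feature_names = ["income", "debt_ratio", "account_age", "noise"]  # illustrative labels

model = LogisticRegression().fit(X, y)
baseline = model.score(X, y)

# Permutation importance: shuffle one feature at a time and measure how much
# accuracy drops. A large drop means the model relies heavily on that feature,
# which gives a human-readable account of what drives its decisions.
rng = np.random.default_rng(0)
for i, name in enumerate(feature_names):
    X_shuffled = X.copy()
    rng.shuffle(X_shuffled[:, i])
    drop = baseline - model.score(X_shuffled, y)
    print(f"{name}: accuracy drop = {drop:.3f}")
```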
Copyright
Legal rights that grant the creator of original works exclusive control over their reproduction, distribution, and adaptation. In a business context, copyrights protect software code, written content, marketing materials, training programmes, and creative works as intangible assets.
Tokenisation
The process of breaking text, code, or other sequential data into discrete units (tokens) that serve as the input and output elements for large language models. Tokenisation determines how a model processes language and directly affects inference costs, since API pricing for large language models is typically based on token count.
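The sketch below shows how token count feeds directly into a cost estimate. A whitespace split stands in for a real subword tokeniser, and the per-1,000-token prices are placeholder figures rather than any provider's actual rates.

```python
# Minimal sketch of why tokenisation drives inference cost. Real models use
# subword tokenisers (e.g. byte-pair encoding); a whitespace split stands in
# here so the example stays self-contained.

PRICE_PER_1K_INPUT_TOKENS = 0.0005   # hypothetical rate, in USD
PRICE_PER_1K_OUTPUT_TOKENS = 0.0015  # hypothetical rate, in USD

def count_tokens(text: str) -> int:
    """Crude stand-in for a subword tokeniser: one token per whitespace-separated word."""
    return len(text.split())

prompt = "Summarise the attached contract and list the key obligations of each party."
expected_response_tokens = 300  # assumed length of the model's reply

input_tokens = count_tokens(prompt)
cost = (input_tokens / 1000) * PRICE_PER_1K_INPUT_TOKENS \
     + (expected_response_tokens / 1000) * PRICE_PER_1K_OUTPUT_TOKENS

print(f"Input tokens: {input_tokens}")
print(f"Estimated cost of one call: ${cost:.6f}")
```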
Privacy by Design
An approach to systems engineering and product development that embeds data protection principles into the design and architecture of IT systems and business practices from the outset, rather than retrofitting them. Privacy by Design is codified as a legal requirement under GDPR Article 25 and encompasses data minimisation, pseudonymisation, and purpose limitation as default settings.
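As a narrow illustration of two of those defaults, the sketch below pseudonymises a direct identifier with a keyed hash and strips fields the processing purpose does not need. The field names and secret key are hypothetical, and a real deployment would add key management, access controls, and retention rules.

```python
import hashlib
import hmac

# Illustrative Privacy by Design defaults: pseudonymisation (replace a direct
# identifier with a keyed hash) and data minimisation (keep only the fields
# the processing purpose actually needs).

SECRET_KEY = b"replace-with-a-managed-secret"  # hypothetical; store securely in practice

def pseudonymise(identifier: str) -> str:
    """Keyed hash so records can be linked internally without exposing the raw ID."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def minimise(record: dict, allowed_fields: set) -> dict:
    """Drop every field the stated purpose does not require."""
    return {k: v for k, v in record.items() if k in allowed_fields}

raw = {"email": "jane@example.com", "purchase_total": 42.0, "browser_history": ["..."]}
stored = minimise(raw, allowed_fields={"purchase_total"})
stored["customer_ref"] = pseudonymise(raw["email"])
print(stored)
```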
Model Drift
The degradation in a machine learning model's predictive accuracy over time as the statistical properties of the input data diverge from the training data distribution. Model drift requires ongoing monitoring and periodic retraining to maintain performance, and is a key operational risk in production AI systems.
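One widely used heuristic for that monitoring is the Population Stability Index (PSI), which compares the binned distribution of a feature at training time with its live distribution. The sketch below computes PSI for a synthetic feature; the 0.25 threshold in the final comment is a conventional rule of thumb, not a formal standard.

```python
import numpy as np

def psi(expected: np.ndarray, actual: np.ndarray, bins: int = 10) -> float:
    """PSI = sum((actual% - expected%) * ln(actual% / expected%)) over shared bins."""
    # Build bin edges from both samples so every value falls inside a bin.
    edges = np.histogram_bin_edges(np.concatenate([expected, actual]), bins=bins)
    exp_counts, _ = np.histogram(expected, bins=edges)
    act_counts, _ = np.histogram(actual, bins=edges)
    # Convert counts to proportions, with a small floor to avoid log(0).
    exp_pct = np.clip(exp_counts / exp_counts.sum(), 1e-6, None)
    act_pct = np.clip(act_counts / act_counts.sum(), 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))

rng = np.random.default_rng(0)
training_feature = rng.normal(loc=0.0, scale=1.0, size=10_000)  # distribution at training time
live_feature = rng.normal(loc=0.4, scale=1.2, size=10_000)      # shifted distribution in production

score = psi(training_feature, live_feature)
print(f"PSI = {score:.3f}")  # values above roughly 0.25 are often read as significant drift
```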