Vector Embeddings
Vector embeddings are numerical representations of text, images, or other data that capture semantic meaning, enabling machines to understand similarity and relationships between pieces of information.
Understanding Vector Embeddings
When text is converted into a vector embedding, its meaning is encoded as a list of numbers, typically with hundreds or thousands of dimensions. Similar concepts end up close together in this numerical space: the phrase 'schedule a meeting' sits close to 'book a call' but far from 'buy groceries.' This property makes vector embeddings essential for semantic search, where you find information by meaning rather than exact keyword matches. Vector databases store these embeddings and enable fast similarity searches across millions of data points.
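As a rough illustration of that closeness property, the sketch below embeds three short phrases and compares them with cosine similarity. It assumes the open-source sentence-transformers library and the all-MiniLM-L6-v2 model, which are illustrative choices, not anything GAIA is documented to use.

```python
# Minimal sketch: embed phrases and measure how close their meanings are.
# Assumes the sentence-transformers library; the model is a common
# lightweight choice used here purely for illustration.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

phrases = ["schedule a meeting", "book a call", "buy groceries"]
embeddings = model.encode(phrases)  # one vector per phrase (384 dimensions for this model)

def cosine_similarity(a, b):
    # Cosine similarity: values near 1.0 mean very similar meaning,
    # values near 0.0 mean largely unrelated.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings[0], embeddings[1]))  # high: related meanings
print(cosine_similarity(embeddings[0], embeddings[2]))  # low: unrelated meanings
```

Running this prints a noticeably higher similarity score for 'schedule a meeting' versus 'book a call' than for 'schedule a meeting' versus 'buy groceries', which is exactly the property semantic search exploits.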
How GAIA Uses Vector Embeddings
GAIA uses ChromaDB as its vector database to store embeddings of your emails, tasks, notes, and documents. When you ask GAIA to find information, or when the agent needs context for a task, it performs semantic search across your embedded data. This means you can ask 'find that email about the Q3 budget review' and GAIA will find it even if the email subject line says 'Financial Planning Discussion,' because the search matches on meaning rather than keywords.
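To make the pattern concrete, here is a minimal sketch of storing and querying documents with ChromaDB. It shows the general ChromaDB API, not GAIA's actual implementation; the collection name, sample documents, and metadata are hypothetical.

```python
# Minimal sketch of semantic search with ChromaDB. This illustrates the
# general API pattern, not GAIA's internal code; the collection name and
# sample documents are hypothetical.
import chromadb

client = chromadb.Client()  # in-memory client; persistent clients are also available
collection = client.create_collection(name="emails")

# Store documents; ChromaDB embeds them with its default embedding function.
collection.add(
    ids=["email-1", "email-2"],
    documents=[
        "Financial Planning Discussion: let's review the Q3 budget next week.",
        "Grocery list for the weekend: milk, eggs, bread.",
    ],
    metadatas=[{"source": "email"}, {"source": "email"}],
)

# Query by meaning: the budget email is returned even though its text
# never contains the exact phrase 'Q3 budget review'.
results = collection.query(
    query_texts=["find that email about the Q3 budget review"],
    n_results=1,
)
print(results["documents"][0][0])
```

The query returns the financial-planning email because its embedding is closest in meaning to the query, which is the behavior described above.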
Related Concepts
Semantic Search
Semantic search is a search technique that understands the meaning and intent behind a query, returning results based on conceptual relevance rather than exact keyword matches.
Graph-Based Memory
Graph-based memory is an AI memory architecture that stores information as interconnected nodes and relationships, enabling rich contextual understanding and persistent knowledge across interactions.
Knowledge Graph
A knowledge graph is a structured representation of information that organizes data as entities, their attributes, and the relationships between them, enabling machines to understand and reason about connected information.
Large Language Model (LLM)
A Large Language Model (LLM) is an artificial intelligence model trained on vast amounts of text data that can understand, generate, and reason about human language with remarkable fluency.
Frequently Asked Questions
Why are vector embeddings important for AI?
Vector embeddings allow AI to understand meaning rather than just matching keywords. GAIA uses embeddings to search your emails, tasks, and documents by meaning, so you can find information even when you do not remember exact words.
What is a vector database?
A vector database stores and indexes vector embeddings for fast similarity search. GAIA uses ChromaDB as its vector database, enabling semantic search across all your connected data sources.

