# Vector Symbolic Architectures

Vector Symbolic Architectures (VSA) combine a hypervector space and a set of operations on these vectors. Hypervectors provide powerful and noise-robust representations and VSAs are associated with promising theoretical properties for approaching high-level cognitive tasks.

## Tutorials on Vector Symbolic Architectures

- We organized a free online tutorial on VSAs during the European Conference on Artificial Intelligence (ECAI) in 2020. More information, slides and supplemental material can be found on our tutorial website.

- We organized a half-day tutorial on VSAs during the German Conference on Artificial Intelligence (KI) in 2019. More information, slides and supplemental material can be found on our tutorial website.

## Our work on Vector Symbolic Architectures

An introduction to our work on VSAs was recently given by Peer Neubert during the Online Webinars on Developments in Hyperdimensional Computing and Vector-Symbolic Architectures.

### References

- (2019) An Introduction to Hyperdimensional Computing for Robotics. KI - Künstliche Intelligenz, Special Issue: Reintegrating Artificial Intelligence and Robotics, Vol. 33. DOI: 10.1007/s13218-019-00623-z
- (2020) A comparison of Vector Symbolic Architectures. arXiv:2001.11797
- (2016) Learning Vector Symbolic Architectures for Reactive Robot Behaviours. In Proc. of Intl. Conf. on Intelligent Robots and Systems (IROS) Workshop on Machine Learning Methods for High-Level Cognitive Capabilities in Robotics
- (2019) A neurologically inspired sequence processing model for mobile robot place recognition. IEEE Robotics and Automation Letters (RA-L), with presentation at Intl. Conf. on Intelligent Robots and Systems (IROS). DOI: 10.1109/LRA.2019.2927096
- (2019) Towards combining a neocortex model with entorhinal grid cells for mobile robot localization. In Proc. of European Conference on Mobile Robotics (ECMR). DOI: 10.1109/ECMR.2019.8870939

## An introduction to Vector Symbolic Architectures


### Hypervectors - The upside of the curse of dimensionality

Hypervectors are high-dimensional vectors, typically with more than 1,000 dimensions. They can be sparse or dense, and binary, ternary, or real-valued. Anyone who has tried something like a nearest-neighbour classifier in spaces with more than a few (e.g. 10) dimensions has likely encountered the curse of dimensionality: algorithms that work well in low-dimensional spaces often fail when faced with high-dimensional data. So are high-dimensional spaces bad and to be avoided? Not at all. High-dimensional representations are omnipresent, in particular in massively parallel biological processing systems, and there are at least two properties that can be exploited in technical (i.e. algorithmic) applications:

- The capacity of a hypervector is enormous (even if it is sparse and binary).
- The number of almost orthogonal vectors is exponential in the number of dimensions.
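The second property is easy to check empirically. The following sketch (pure Python; the dimensionality and the bipolar {-1, +1} encoding are illustrative choices, not prescribed by any particular VSA) draws two random hypervectors and measures their cosine similarity:

```python
import random

def random_bipolar(d, rng):
    """Draw a random bipolar hypervector with entries from {-1, +1}."""
    return [rng.choice((-1, 1)) for _ in range(d)]

def cosine(a, b):
    """Cosine similarity; for bipolar vectors this equals the mean elementwise agreement."""
    return sum(x * y for x, y in zip(a, b)) / len(a)

rng = random.Random(42)
d = 10000
x = random_bipolar(d, rng)
y = random_bipolar(d, rng)

print(cosine(x, x))  # 1.0: a vector is maximally similar to itself
print(cosine(x, y))  # close to 0: two random hypervectors are almost orthogonal
```

With 10,000 dimensions, the similarity of two independent random vectors concentrates tightly around zero, which is exactly why randomly drawn hypervectors can serve as quasi-orthogonal symbols.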

### VSA = Hypervector Space + Operations

In 2003, Ross Gayler established the term Vector Symbolic Architecture (VSA) for the combination of a high-dimensional space and a certain set of operations with well-defined properties. The set of operations includes at least:

- ⊗ to bind hypervectors
- ⊕ to bundle hypervectors
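One common concrete instantiation (the Multiply-Add-Permute, or MAP, architecture) realizes these operators on bipolar vectors with elementwise multiplication as bind and elementwise addition as bundle. A minimal sketch under that assumption:

```python
def bind(a, b):
    """MAP-style binding: elementwise multiplication of bipolar vectors."""
    return [x * y for x, y in zip(a, b)]

def bundle(a, b):
    """MAP-style bundling: elementwise addition; the result stays similar to each input."""
    return [x + y for x, y in zip(a, b)]

a = [1, -1, 1, -1]   # toy 4-dimensional vectors for readability;
b = [1, 1, -1, -1]   # real hypervectors would have thousands of dimensions

print(bind(a, b))    # [1, -1, -1, 1]
print(bundle(a, b))  # [2, 0, 0, -2]
```

Note the characteristic difference: the bound vector is dissimilar to both inputs, while the bundled vector remains similar to each of them.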

A typical property of the bind operator is that it is self-inverse:

X⊗X = I

That is, the result of binding a hypervector with itself is the identity element I. The bind() operator can then, e.g., be used to create role-filler pairs. Given hypervectors X and A, we can assign the value "A" to the variable "X" by using the bind operator:

H = X⊗A

To query the value of the variable X, we apply the bind operator to the hypervector H:

X⊗H = X⊗(X⊗A) = (X⊗X)⊗A = I⊗A = A
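This derivation can be replayed in code. Assuming the MAP-style instantiation sketched above (bipolar vectors, elementwise multiplication as a self-inverse bind), unbinding a single role-filler pair recovers the value exactly:

```python
import random

def bind(a, b):
    """Elementwise multiplication: a self-inverse binding for bipolar vectors."""
    return [x * y for x, y in zip(a, b)]

rng = random.Random(0)
d = 10000
X = [rng.choice((-1, 1)) for _ in range(d)]  # variable hypervector
A = [rng.choice((-1, 1)) for _ in range(d)]  # value hypervector

# X (x) X = I: for this binding, the identity is the all-ones vector.
assert bind(X, X) == [1] * d

H = bind(X, A)          # role-filler pair H = X (x) A
recovered = bind(X, H)  # X (x) H = (X (x) X) (x) A = A
assert recovered == A   # exact recovery: a single pair introduces no noise
```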

In combination with a distributive bundle operator, we can also query a variable's value from a bundle of role-filler pairs, e.g. with variable hypervectors X and Y and value hypervectors A and B:

H = X⊗A ⊕ Y⊗B

Let me emphasize that H is an element of the same hypervector space as the variables and values. To query the value of variable X, we proceed as above and bind the result vector H with the query variable X:

X⊗H = X⊗(X⊗A ⊕ Y⊗B) = (X⊗X⊗A) ⊕ (X⊗Y⊗B) = A ⊕ noise

"noise" means that the result of (X ⊗Y⊗B) is not similar of anything known (which are in this example X, Y, A, B, X⊗A, Y⊗B, and H). In fact it is very likely to be almost orthogonal to any other known vector - a result of the above listed second property of the underlying hypervector space.To solve a task using a VSA, further elements like a clean-up memory, encoder/decoders, and further operators (e.g., a protect operator) are required. However, based on VSA, there are surprisingly elegant ways of implementing solutions to various problems: E.g. think of a algorithm for mobile robot navigation, where the sensor input, the actuator output and the whole program that maps the first to the latter, are elements from the very same hypervector space.