Where to start if you are new to the area?
We recommend starting your exploration of the area by reading the introductory papers below:
P. Neubert, S. Schubert, and P. Protzel, “An Introduction to Hyperdimensional Computing for Robotics,” KI - Künstliche Intelligenz, vol. 33, no. 4, pp. 319–330, 2019.
P. Kanerva, “Computing with High-Dimensional Vectors,” IEEE Design & Test, vol. 36, no. 3, pp. 7–14, 2019.
P. Kanerva, “Hyperdimensional Computing: An Introduction to Computing in Distributed Representation with High-Dimensional Random Vectors,” Cognitive Computation, vol. 1, no. 2, pp. 139–159, 2009.
Please also check the page with videos, which contains many recordings of great introductory lectures by P. Kanerva as well as a video giving a high-level overview of HD/VSA implemented in hardware.
If you would like to look for papers within a specific direction of HD/VSA research, consider searching the collection of publications.
Is there a survey of the area?
Yes, there is a comprehensive two-part survey that extensively covers most aspects of HD/VSA:
D. Kleyko, D. A. Rachkovskij, E. Osipov, and A. Rahimi, “A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part I: Models and Data Transformations,” ACM Computing Surveys, vol. 55, no. 6, pp. 1–40, 2022.
D. Kleyko, D. A. Rachkovskij, E. Osipov, and A. Rahimi, “A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges,” ACM Computing Surveys, vol. 55, no. 9, pp. 1–52, 2023.
What types of data structures can be represented by HD/VSA?
HD/VSA can represent and query an impressive range of data structures. Examples include key-value pairs (also known as role-filler bindings), sets, histograms, sequences, graphs, trees, stacks, finite-state automata, and others; a minimal sketch of encoding key-value pairs follows the reference below. There is a comprehensive write-up that introduces the transformation of a plethora of data structures into high-dimensional vectors. Informally, we refer to this write-up as the “cookbook”:
D. Kleyko, M. Davies, E. P. Frady, P. Kanerva, S. J. Kent, B. A. Olshausen, E. Osipov, J. M. Rabaey, D. A. Rachkovskij, A. Rahimi, F. T. Sommer, “Vector Symbolic Architectures as a Computing Framework for Emerging Hardware,” Proceedings of the IEEE, vol. 110, no. 10, pp. 1538–1571, 2022.
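To make the role-filler idea concrete, here is a minimal sketch, assuming a multiply-add style model (in the spirit of MAP) with random bipolar hypervectors; the roles, fillers, and dimensionality below are made-up illustration choices, not a prescribed encoding:

```python
# Minimal sketch: key-value pairs (role-filler bindings) with bipolar
# hypervectors. Binding is elementwise multiplication (self-inverse),
# bundling is elementwise addition.
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality (illustrative choice)

def random_hv():
    return rng.choice([-1, 1], size=D)

# Codebooks of random hypervectors for roles (keys) and fillers (values)
roles = {name: random_hv() for name in ("color", "shape", "size")}
fillers = {name: random_hv() for name in ("red", "round", "small", "blue")}

# Encode the record {color: red, shape: round, size: small} as a single
# hypervector: bundle (sum) the role-filler bindings (elementwise products).
record = (roles["color"] * fillers["red"]
          + roles["shape"] * fillers["round"]
          + roles["size"] * fillers["small"])

# Query: which filler is bound to the role "color"? Unbinding is another
# multiplication by the role vector; the result is the stored filler plus
# crosstalk noise, resolved by a nearest-neighbour search in the codebook.
noisy = record * roles["color"]
best = max(fillers, key=lambda f: np.dot(noisy, fillers[f]))
print(best)  # "red" with high probability for large D
```

Because the crosstalk from the other bound pairs is quasi-orthogonal to every codebook vector, the correct filler wins the similarity search with high probability when D is large.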
What is the information capacity of HD/VSA representations?
It is very common to ask about the information capacity of the distributed representations used in HD/VSA. Some early results in this direction were obtained in [1] when studying the Holographic Reduced Representations model. Later, results for the Matrix Binding of Additive Terms and Binary Spatter Codes models were presented in [2] and [3], respectively. The most general and comprehensive analysis of the information capacity of different HD/VSA models (and also of some classes of recurrent neural networks) was provided in [4] and extended further in [5]. We therefore highly recommend studying the theory in [4] to get a strong grasp of the information capacity of HD/VSA models. A small simulation illustrating the basic capacity trade-off is sketched after the references below.
T. A. Plate, “Distributed Representations and Nested Compositional Structure,” Doctoral Thesis, University of Toronto, p. 202, 1994.
S. I. Gallant, T. W. Okaywe, “Representing Objects, Relations, and Sequences,” Neural Computation, vol. 25, no. 8, pp. 2038–2078, 2013.
D. Kleyko, E. Osipov, A. Senior, A. I. Khan, Y. A. Şekercioğlu, “Holographic Graph Neuron: A Bioinspired Architecture for Pattern Processing,” IEEE Transactions on Neural Networks and Learning Systems, vol. 28, no. 6, pp. 1250–1262, 2017.
E. P. Frady, D. Kleyko, F. T. Sommer, “A Theory of Sequence Indexing and Working Memory in Recurrent Neural Networks,” Neural Computation, vol. 30, no. 6, pp. 1449–1513, 2018.
D. Kleyko, A. Rosato, E. P. Frady, M. Panella, F. T. Sommer, “Perceptron Theory Can Predict the Accuracy of Neural Networks,” IEEE Transactions on Neural Networks and Learning Systems, pp. 1–15, 2023.
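To complement the theory, here is a small simulation sketch (our own illustration, not taken from the papers above) of the basic capacity trade-off: the more random bipolar hypervectors are bundled into a single vector, the lower the chance of recovering all of them from a codebook, while increasing the dimensionality restores the accuracy. All sizes below are arbitrary illustration choices:

```python
# Sketch: retrieval accuracy from a bundled hypervector as a function of
# the number of stored items and the dimensionality D.
import numpy as np

rng = np.random.default_rng(1)

def retrieval_accuracy(D, num_stored, codebook_size=1000, trials=20):
    hits = 0.0
    for _ in range(trials):
        codebook = rng.choice([-1, 1], size=(codebook_size, D))
        stored = rng.choice(codebook_size, size=num_stored, replace=False)
        bundle = codebook[stored].sum(axis=0)  # superposition of the items
        sims = codebook @ bundle               # similarity to every codebook entry
        top = np.argsort(sims)[-num_stored:]   # best-matching entries
        hits += len(set(top) & set(stored)) / num_stored
    return hits / trials

for D in (256, 1024, 4096):
    print(D, [round(retrieval_accuracy(D, k), 2) for k in (5, 15, 45)])
# Accuracy drops as more items are bundled and recovers as D grows.
```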
What is the "Blessing of Dimensionality"?
This term is often used to name phenomena opposite to the well-known Curse of Dimensionality. Different areas of mathematics and computer science assign different meanings to the Blessing of Dimensionality.
In HD/VSA, the Blessing of Dimensionality manifests itself in the increased capacity [1] of distributed representations for vectors of higher dimensionality.
In stochastic computing, the Blessing of Dimensionality manifests itself in the progressive precision [2] of computations for vectors of higher dimensionality.
In compressed sensing, the Blessing of Dimensionality manifests itself in the improved quality of the reconstructed signal [3] for projection matrices with a larger number of dimensions.
In concentration of measure, the Blessing of Dimensionality manifests itself in stochastic separation theorems [4], which suggest that “If the dimension n of the underlying topological vector space is large, then random finite but exponentially large in n samples are linearly separable, with high probability, for a range of practically relevant classes of distributions”.
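A quick numerical sketch of this concentration effect (our own illustration, not from [4]): cosine similarities between independent random bipolar vectors concentrate around zero, with spread shrinking roughly as 1/sqrt(n). This quasi-orthogonality is what makes random hypervectors reliably distinguishable:

```python
# Sketch: concentration of pairwise cosine similarities of random bipolar
# vectors as the dimensionality n grows.
import numpy as np

rng = np.random.default_rng(2)

for n in (100, 1_000, 10_000):
    X = rng.choice([-1, 1], size=(200, n))   # 200 random bipolar vectors
    sims = (X @ X.T) / n                     # pairwise cosine similarities
    off = sims[~np.eye(len(X), dtype=bool)]  # drop self-similarities
    print(f"n={n:>6}: mean={off.mean():+.4f}, std={off.std():.4f}, "
          f"1/sqrt(n)={n**-0.5:.4f}")
```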
Literature:
E. P. Frady, D. Kleyko, F. T. Sommer, “A Theory of Sequence Indexing and Working Memory in Recurrent Neural Networks,” Neural Computation, vol. 30, no. 6, pp. 1449–1513, 2018.
A. Alaghi, J. P. Hayes, “Computing with Randomness,” IEEE Spectrum, vol. 55, no. 3, pp. 46–51, 2018.
D. L. Donoho, “Compressed Sensing,” IEEE Transactions on Information Theory, vol. 52, no. 4, pp. 1289–1306, 2006.
A. N. Gorban, I. Y. Tyukin, “Blessing of Dimensionality: Mathematical Foundations of the Statistical Physics of Data,” Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, vol. 376, no. 2118, pp. 1–18, 2018.