Count sketch
{{short description|Method of dimensionality reduction}}
{{Machine learning bar}}
Count sketch is a type of dimensionality reduction that is particularly efficient in statistics, machine learning and algorithms.<ref>Faisal M. Algashaam; Kien Nguyen; Mohamed Alkanhal; Vinod Chandran; Wageeh Boles. "Multispectral Periocular Classification With Multimodal Compact Multi-Linear Pooling" [https://ieeexplore.ieee.org/document/7990127]. IEEE Access, Vol. 5. 2017.</ref><ref>{{Cite web |last1=Ahle |first1=Thomas |last2=Knudsen |first2=Jakob |date=2019-09-03 |title=Almost Optimal Tensor Sketch |url=https://www.researchgate.net/publication/335617805 |access-date=2020-07-11 |website=ResearchGate}}</ref>
It was invented by Moses Charikar, Kevin Chen and Martin Farach-Colton{{sfn | Charikar | Chen | Farach-Colton | 2004 | pp=}} in an effort to speed up the AMS Sketch by Alon, Matias and Szegedy for approximating the frequency moments of streams<ref>Alon, Noga, Yossi Matias, and Mario Szegedy. "The space complexity of approximating the frequency moments." Journal of Computer and System Sciences 58.1 (1999): 137–147.</ref> (these calculations require counting the number of occurrences of the distinct elements of the stream).
The sketch is nearly identical{{cn|reason=see the talk page|date=September 2023}} to the Feature hashing algorithm by John Moody,<ref>Moody, John. "Fast learning in multi-resolution hierarchies." Advances in Neural Information Processing Systems. 1989.</ref> but differs in its use of hash functions with low dependence, which makes it more practical.
In order to still have a high probability of success, the median trick is used to aggregate multiple count sketches, rather than the mean.
These properties allow its use for explicit kernel methods and bilinear pooling in neural networks, and make it a cornerstone of many numerical linear algebra algorithms.<ref>Woodruff, David P. "Sketching as a Tool for Numerical Linear Algebra." Foundations and Trends in Theoretical Computer Science 10.1–2 (2014): 1–157.</ref>
== Intuitive explanation ==
The inventors of this data structure offer the following iterative explanation of its operation:{{sfn | Charikar | Chen | Farach-Colton | 2004 | pp=}}
- at the simplest level, the output of a single hash function {{mvar|s}} mapping stream elements {{mvar|q}} into {+1, −1} is feeding a single up/down counter {{mvar|C}}. After a single pass over the data, the frequency <math>n_q</math> of a stream element {{mvar|q}} can be approximated, although extremely poorly, by the expected value <math>E[C \cdot s(q)]</math>;
- a straightforward way to improve the variance of the previous estimate is to use an array of different hash functions <math>s_i</math>, each connected to its own counter <math>C_i</math>. For each element {{mvar|q}}, the <math>E[C_i \cdot s_i(q)] = n_q</math> still holds, so averaging across the {{mvar|i}} range will tighten the approximation;
- the previous construct still has a major deficiency: if a lower-frequency-but-still-important output element {{mvar|a}} exhibits a hash collision with a high-frequency element, the estimate of <math>n_a</math> can be significantly affected. Avoiding this requires reducing the frequency of collision counter updates between any two distinct elements. This is achieved by replacing each counter <math>C_i</math> in the previous construct with an array of {{mvar|m}} counters (making the counter set into a two-dimensional matrix <math>C_{i,j}</math>), with the index {{mvar|j}} of the particular counter to be incremented/decremented selected via another set of hash functions <math>h_i</math> that map element {{mvar|q}} into the range {1..{{mvar|m}}}. Since <math>E[C_{i,h_i(q)} \cdot s_i(q)] = n_q</math>, averaging across all values of {{mvar|i}} will work.
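The first step of this progression can be simulated in a few lines of Python. The sketch below (a toy illustration; the stream contents, function names and seed handling are not from the original paper) drives a single up/down counter with a random sign hash and shows that averaging over many independent sign hashes recovers the frequency:

```python
import random

def single_counter_estimate(stream, q, seed):
    """One up/down counter C driven by a random sign hash s mapping
    elements to {+1, -1}; C * s(q) is an unbiased but very noisy
    estimate of the frequency of q."""
    rng = random.Random(seed)
    signs = {}                      # lazily drawn random sign hash s
    def s(x):
        if x not in signs:
            signs[x] = rng.choice((+1, -1))
        return signs[x]
    C = sum(s(x) for x in stream)   # single pass over the data
    return C * s(q)

stream = ["a"] * 50 + ["b"] * 30 + ["c"] * 5
# Averaging over many independent sign hashes tightens the approximation
# of the true frequency of "a" (which is 50 here).
estimates = [single_counter_estimate(stream, "a", seed) for seed in range(1000)]
print(sum(estimates) / len(estimates))
```

A single estimate has variance equal to the sum of squared frequencies of the other elements, which is why the full data structure adds buckets and a median rather than relying on averaging alone.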
== Mathematical definition ==
1. For constants <math>w</math> and <math>t</math> (to be defined later) independently choose <math>t</math> random hash functions
<math>h_1, \dots, h_t</math> and <math>s_1, \dots, s_t</math> such that
<math>h_i : [n] \to [w]</math> and
<math>s_i : [n] \to \{+1, -1\}</math>.
It is necessary that the hash families from which <math>h_i</math> and <math>s_i</math> are chosen be pairwise independent.
2. For each item <math>q_i</math> in the stream, add <math>s_j(q_i)</math> to the <math>h_j(q_i)</math>th bucket of the <math>j</math>th hash.
At the end of this process, one has <math>wt</math> sums <math>(C_{i,j})</math> where
: <math>C_{i,j} = \sum_{k : h_i(q_k) = j} s_i(q_k).</math>
To estimate the count of <math>q</math>s one computes the following value:
: <math>r_q = \mathrm{median}_{i=1}^{t} \; s_i(q) \cdot C_{i, h_i(q)}.</math>
The values <math>s_i(q) \cdot C_{i, h_i(q)}</math> are unbiased estimates of how many times <math>q</math> has appeared in the stream.
The estimate <math>r_q</math> has variance <math>O(\min(m_1^2/w^2,\, m_2^2/w))</math>, where
<math>m_1</math> is the length of the stream and <math>m_2^2</math> is <math>\textstyle\sum_q n_q^2</math>.<ref>Larsen, Kasper Green, Rasmus Pagh, and Jakub Tětek. "CountSketches, Feature Hashing and the Median of Three." International Conference on Machine Learning. PMLR, 2021.</ref>
Furthermore, <math>r_q</math> is guaranteed to never be more than <math>2 m_2 / \sqrt{w}</math> off from the true value, with probability <math>1 - e^{-O(t)}</math>.
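The two-step definition above can be sketched as a small Python class. This is a minimal illustration, not a tuned implementation: string-seeded `random.Random` instances stand in for the pairwise independent hash families, and the parameters `t=5, w=64` and the toy stream are arbitrary choices:

```python
import random
from statistics import median

class CountSketch:
    """A minimal count sketch with t hash pairs (h_i, s_i) and w buckets."""
    def __init__(self, t, w, seed=0):
        self.t, self.w = t, w
        rng = random.Random(seed)
        self._salts = [rng.getrandbits(64) for _ in range(t)]
        self.C = [[0] * w for _ in range(t)]          # t x w counter matrix

    def _hash(self, q, i):
        # Deterministic stand-in for a pairwise independent hash pair.
        r = random.Random(f"{self._salts[i]}:{q}")
        return r.randrange(self.w), r.choice((+1, -1))    # h_i(q), s_i(q)

    def update(self, q, count=1):
        for i in range(self.t):
            h, s = self._hash(q, i)
            self.C[i][h] += s * count     # add s_i(q) * count to bucket h_i(q)

    def estimate(self, q):
        # r_q = median over i of s_i(q) * C[i][h_i(q)]
        ests = []
        for i in range(self.t):
            h, s = self._hash(q, i)
            ests.append(s * self.C[i][h])
        return median(ests)

sk = CountSketch(t=5, w=64, seed=1)
for x in ["a"] * 100 + ["b"] * 40 + ["c"] * 3:
    sk.update(x)
print(sk.estimate("a"))   # close to 100 with high probability
```

The median over the <math>t</math> rows is what gives the exponentially small failure probability; replacing it with a mean would leave the estimator exposed to a single bad hash collision.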
=== Vector formulation ===
Alternatively Count-Sketch can be seen as a linear mapping with a non-linear reconstruction function.
Let <math>M^{(i)} \in \{-1, 0, 1\}^{w \times n}</math>, <math>i \in [t]</math>, be a collection of matrices, defined by
: <math>M^{(i)}_{h_i(j),\, j} = s_i(j)</math>
for <math>j \in [n]</math> and 0 everywhere else.
Then a vector <math>v \in \mathbb{R}^n</math> is sketched by <math>C^{(i)} = M^{(i)} v \in \mathbb{R}^w</math>.
To reconstruct <math>v_j</math> we take <math>v_j^* = \mathrm{median}_i \; s_i(j)\, C^{(i)}_{h_i(j)}</math>.
This gives the same guarantees as stated above, if we take <math>m_1 = \|v\|_1</math> and <math>m_2 = \|v\|_2</math>.
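The linear-map view can be checked directly: each matrix <math>M^{(i)}</math> has one nonzero per column, so applying it reduces to a bucketed signed sum. The sketch below (sizes <math>n=20</math>, <math>w=8</math>, <math>t=5</math> and the sparse test vector are illustrative assumptions) sketches a vector and reconstructs a coordinate:

```python
import random
from statistics import median

n, w, t = 20, 8, 5                  # illustrative sizes, not from the text
rng = random.Random(42)
h = [[rng.randrange(w) for _ in range(n)] for _ in range(t)]        # h_i(j)
s = [[rng.choice((+1, -1)) for _ in range(n)] for _ in range(t)]    # s_i(j)

def sketch(v, i):
    """C^(i) = M^(i) v, where M^(i)[h_i(j)][j] = s_i(j) and 0 elsewhere.
    The matrix is applied implicitly, one nonzero per column."""
    C = [0.0] * w
    for j in range(n):
        C[h[i][j]] += s[i][j] * v[j]
    return C

v = [0.0] * n
v[3], v[7] = 10.0, 2.0              # a sparse test vector
Cs = [sketch(v, i) for i in range(t)]

def reconstruct(j):
    # v_j* = median over i of s_i(j) * C^(i)[h_i(j)]
    return median(s[i][j] * Cs[i][h[i][j]] for i in range(t))

print(reconstruct(3))               # approximately 10.0
```

Linearity is the practically important property here: sketches of two vectors can be added coordinate-wise to obtain the sketch of their sum, which is what makes the structure mergeable across streams.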
== Relation to Tensor sketch ==
The count sketch projection of the outer product of two vectors is equivalent to the convolution of two component count sketches.
The count sketch computes a vector convolution
: <math>C^{(1)} x \ast C^{(2)} y</math>,
where <math>C^{(1)}</math> and <math>C^{(2)}</math> are independent count sketch matrices.
Pham and Pagh{{cite conference
| title = Fast and scalable polynomial kernels via explicit feature maps
| last1 = Ninh
| first1 = Pham
| first2 = Rasmus
| last2 = Pagh | author2-link = Rasmus Pagh
| date = 2013
| publisher = Association for Computing Machinery
| conference = SIGKDD international conference on Knowledge discovery and data mining
|doi = 10.1145/2487575.2487591}}
show that this equals <math>C(x \otimes y)</math> – a count sketch <math>C</math> of the outer product of vectors, where <math>\otimes</math> denotes Kronecker product.
The fast Fourier transform can be used to do fast convolution of count sketches.
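This identity can be verified numerically. In the sketch below (sizes and test vectors are arbitrary; the naive <math>O(w^2)</math> loop stands in for the FFT-based convolution), the combined sketch of the outer product uses the Pham–Pagh construction <math>h(i,j) = (h_1(i) + h_2(j)) \bmod w</math> and <math>s(i,j) = s_1(i)\, s_2(j)</math>:

```python
import random

w, n = 16, 6                    # sketch width and vector length (assumed)
rng = random.Random(7)
h1 = [rng.randrange(w) for _ in range(n)]
h2 = [rng.randrange(w) for _ in range(n)]
s1 = [rng.choice((+1, -1)) for _ in range(n)]
s2 = [rng.choice((+1, -1)) for _ in range(n)]

def sketch(v, h, s):
    C = [0.0] * w
    for j, vj in enumerate(v):
        C[h[j]] += s[j] * vj
    return C

def circ_conv(a, b):
    # Naive O(w^2) circular convolution; in practice computed in
    # O(w log w) with the fast Fourier transform.
    return [sum(a[i] * b[(k - i) % w] for i in range(w)) for k in range(w)]

x = [1.0, 2.0, 0.0, -1.0, 3.0, 0.5]
y = [0.5, 0.0, 4.0, 1.0, -2.0, 1.0]

# Left side: convolution of the two component count sketches.
lhs = circ_conv(sketch(x, h1, s1), sketch(y, h2, s2))

# Right side: count sketch of the outer product x (x) y, using the combined
# hashes h(i,j) = (h1(i) + h2(j)) mod w and s(i,j) = s1(i) * s2(j).
rhs = [0.0] * w
for i in range(n):
    for j in range(n):
        rhs[(h1[i] + h2[j]) % w] += s1[i] * s2[j] * x[i] * y[j]

print(max(abs(a - b) for a, b in zip(lhs, rhs)))   # ~0 up to float rounding
```

The point of the identity is that the <math>n^2</math>-dimensional outer product never has to be materialized: two length-<math>w</math> sketches and one convolution suffice.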
By using the face-splitting product{{Cite journal|last=Slyusar|first=V. I. |title=End products in matrices in radar applications |url=http://slyusar.kiev.ua/en/IZV_1998_3.pdf|journal=Radioelectronics and Communications Systems |year=1998 |volume=41 |issue=3|pages=50–53}}{{Cite journal|last=Slyusar|first=V. I.|date=1997-05-20|title=Analytical model of the digital antenna array on a basis of face-splitting matrix products. |url=http://slyusar.kiev.ua/ICATT97.pdf|journal=Proc. ICATT-97, Kyiv|pages=108–109}}{{Cite journal|last=Slyusar|first=V. I.|date=March 13, 1998|title=A Family of Face Products of Matrices and its Properties|url=http://slyusar.kiev.ua/FACE.pdf|journal=Cybernetics and Systems Analysis C/C of Kibernetika I Sistemnyi Analiz.- 1999.|volume=35|issue=3|pages=379–384|doi=10.1007/BF02733426|s2cid=119661450 }} such structures can be computed much faster than normal matrices.
== See also ==
- Count–min sketch is a version of the algorithm with smaller memory requirements (and weaker error guarantees as a tradeoff).
- Tensor sketch
== References ==
{{Reflist}}
== Further reading ==
- {{cite journal | last=Charikar | first=Moses | last2=Chen | first2=Kevin | last3=Farach-Colton | first3=Martin | title=Finding frequent items in data streams | journal=Theoretical Computer Science | publisher=Elsevier BV | volume=312 | issue=1 | year=2004 | issn=0304-3975 | doi=10.1016/s0304-3975(03)00400-6 | pages=3–15 | url=https://people.cs.rutgers.edu/~farach/pubs/FrequentStream.pdf}}
- Faisal M. Algashaam; Kien Nguyen; Mohamed Alkanhal; Vinod Chandran; Wageeh Boles. "Multispectral Periocular Classification With Multimodal Compact Multi-Linear Pooling" [https://ieeexplore.ieee.org/document/7990127]. IEEE Access, Vol. 5. 2017.
- {{Cite web |last1=Ahle |first1=Thomas |last2=Knudsen |first2=Jakob |date=2019-09-03 |title=Almost Optimal Tensor Sketch |url=https://www.researchgate.net/publication/335617805 |access-date=2020-07-11 |website=ResearchGate}}