lingo.lol is one of the many independent Mastodon servers you can use to participate in the fediverse.
A place for linguists, philologists, and other lovers of languages.

Server stats: 66 active users
#hyperdimensionalcomputing

Aaron<p>I get really sick of hearing about OpenAI and the rest of the LLM crowd making absurd claims about LLMs like ChatGPT being AGI. LLMs are not even close to AGI. If you want to see real research into AGI, watch this video on Chris Eliasmith's Spaun (semantic pointer architecture unified network). It may not look as snazzy and consumer-ready as all these hallucinating LLMs being marketed to us, but this is the real deal: A single system that can do lots of different things the brain does and that LLMs can never do.</p><p><a href="https://youtu.be/I5h-xjddzlY?si=a-dS3yUPtioypa7P" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">youtu.be/I5h-xjddzlY?si=a-dS3y</span><span class="invisible">UPtioypa7P</span></a></p><p><a href="https://techhub.social/tags/Spaun" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Spaun</span></a><br><a href="https://techhub.social/tags/ArtificialGeneralIntelligence" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ArtificialGeneralIntelligence</span></a> <a href="https://techhub.social/tags/AGI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AGI</span></a> <a href="https://techhub.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://techhub.social/tags/ML" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ML</span></a><br><a href="https://techhub.social/tags/SemanticPointerArchitecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SemanticPointerArchitecture</span></a> <a href="https://techhub.social/tags/SPA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SPA</span></a><br><a href="https://techhub.social/tags/Nengo" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Nengo</span></a><br><a href="https://techhub.social/tags/Hypervectors" 
class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Hypervectors</span></a><br><a href="https://techhub.social/tags/VectorSymbolicArchitecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VectorSymbolicArchitecture</span></a> <a href="https://techhub.social/tags/VSA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VSA</span></a><br><a href="https://techhub.social/tags/HyperdimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperdimensionalComputing</span></a> <a href="https://techhub.social/tags/HDC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HDC</span></a><br><a href="https://techhub.social/tags/HolographicReducedRepresentation" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HolographicReducedRepresentation</span></a> <a href="https://techhub.social/tags/HRR" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HRR</span></a><br><a href="https://techhub.social/tags/OpenAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OpenAI</span></a> <a href="https://techhub.social/tags/ChatGPT" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ChatGPT</span></a> <a href="https://techhub.social/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a> <a href="https://techhub.social/tags/LLMs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLMs</span></a></p>
Aaron<p><span class="h-card" translate="no"><a href="https://yt.lostpod.space/accounts/root" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>root</span></a></span> The closest technologies we have to how the human brain works are not LLMs, but some less well-known ones: reinforcement learning algorithms and hyperdimensional computing. If you want to see what HDC is capable of, check out this video:</p><p><a href="https://youtu.be/P_WRCyNQ9KY?si=JgAuOJQmsQ6tVIiO" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">youtu.be/P_WRCyNQ9KY?si=JgAuOJ</span><span class="invisible">QmsQ6tVIiO</span></a></p><p><a href="https://techhub.social/tags/HDC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HDC</span></a> <a href="https://techhub.social/tags/HyperdimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperdimensionalComputing</span></a><br><a href="https://techhub.social/tags/VSA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VSA</span></a> <a href="https://techhub.social/tags/VectorSymbolicArchitecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VectorSymbolicArchitecture</span></a><br><a href="https://techhub.social/tags/HRR" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HRR</span></a> <a href="https://techhub.social/tags/HolographicReducedRepresentation" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HolographicReducedRepresentation</span></a><br><a href="https://techhub.social/tags/SpikingNeuralNetworks" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SpikingNeuralNetworks</span></a><br><a href="https://techhub.social/tags/AGI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AGI</span></a> <a href="https://techhub.social/tags/ArtificialGeneralIntelligence" class="mention 
hashtag" rel="nofollow noopener" target="_blank">#<span>ArtificialGeneralIntelligence</span></a><br><a href="https://techhub.social/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a></p>
Ross Gayler<p>The next VSAonline webinar is at 17:00 UTC (not the usual time), Monday 27 January.</p><p>Zoom: <a href="https://ltu-se.zoom.us/j/65564790287" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">ltu-se.zoom.us/j/65564790287</span><span class="invisible"></span></a> </p><p>WEB: <a href="https://bit.ly/vsaonline" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">bit.ly/vsaonline</span><span class="invisible"></span></a> </p><p>Speaker: Anthony Thomas from UC Davis, USA</p><p>Title: "Sketching a Picture of Vector Symbolic Architectures"</p><p>Abstract: Sketching algorithms are a broad area of research in theoretical computer science and numerical analysis that aim to distil data into a simple summary, called a "sketch," that retains some essential notion of structure while being much more efficient to store, query, and transmit.</p><p>Vector-symbolic architectures (VSAs) are an approach to computing on data represented using random vectors, and provide an elegant conceptual framework for realizing a wide variety of data structures and algorithms in a way that lends itself to implementation in highly-parallel and energy-efficient computer hardware.</p><p>Sketching algorithms and VSA have a substantial degree of consonance in their methods, motivations, and applications. In this tutorial-style talk, I will discuss some of the connections between these two fields, focusing, in particular, on the connections between VSA and tensor-sketches, a family of sketching algorithms concerned with the setting in which the data being sketched can be decomposed into Kronecker (tensor) products between more primitive objects. 
This is exactly the situation of interest in VSA and the two fields have arrived at strikingly similar solutions to this problem.</p><p><a href="https://aus.social/tags/VectorSymbolicArchitectures" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VectorSymbolicArchitectures</span></a> <a href="https://aus.social/tags/VSA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VSA</span></a> <a href="https://aus.social/tags/HyperdimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperdimensionalComputing</span></a> <a href="https://aus.social/tags/HDC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HDC</span></a> <a href="https://aus.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://aus.social/tags/ML" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ML</span></a> <a href="https://aus.social/tags/ComputationalCognitiveScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ComputationalCognitiveScience</span></a> <a href="https://aus.social/tags/CompCogSci" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CompCogSci</span></a> <a href="https://aus.social/tags/MathematicalPsychology" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MathematicalPsychology</span></a> <a href="https://aus.social/tags/MathPsych" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MathPsych</span></a> <a href="https://aus.social/tags/CognitiveScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CognitiveScience</span></a> <a href="https://aus.social/tags/CogSci" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CogSci</span></a> <span class="h-card" translate="no"><a href="https://a.gup.pe/u/cogsci" class="u-url mention" rel="nofollow noopener" 
target="_blank">@<span>cogsci</span></a></span></p>
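The tensor-sketch connection described in the abstract above can be made concrete: Holographic Reduced Representations bind with circular convolution, the same FFT-based compression that tensor sketches apply to a Kronecker product, so inner products between bound pairs approximate products of inner products. A minimal sketch of that property (my own illustration, not material from the talk):

```python
import numpy as np

rng = np.random.default_rng(42)
d = 10_000

def rand_unit():
    # i.i.d. N(0, 1/d) entries give an approximately unit-norm vector,
    # the standard choice for Holographic Reduced Representations
    return rng.standard_normal(d) / np.sqrt(d)

def bind(x, y):
    # HRR binding: circular convolution via FFT -- the same transform
    # tensor sketches use to compress a Kronecker (tensor) product
    return np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(y), n=d)

a, b, c, e = (rand_unit() for _ in range(4))

# the bound vector behaves like a sketch of the tensor product:
# <a (*) b, c (*) e> approximates <a, c> * <b, e>
same = np.dot(bind(a, b), bind(a, b))   # approximates <a,a><b,b>, i.e. ~1
cross = np.dot(bind(a, b), bind(c, e))  # approximates 0
print(same, cross)
```

The estimates concentrate as the dimensionality grows (fluctuations shrink like 1/sqrt(d)), which is exactly the sketching guarantee.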
Ross Gayler<p>The schedule for the next VSAonline webinar series (January to June 2025) is published at:</p><p><a href="https://sites.google.com/view/hdvsaonline/spring-2025" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">sites.google.com/view/hdvsaonl</span><span class="invisible">ine/spring-2025</span></a></p><p>There are 11 talks around <a href="https://aus.social/tags/VectorSymbolicArchitecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VectorSymbolicArchitecture</span></a> / <a href="https://aus.social/tags/HyperdimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperdimensionalComputing</span></a> </p><p>The talks are (almost always) recorded and published online, in case you can't participate in the live session.</p><p><span class="h-card" translate="no"><a href="https://a.gup.pe/u/cogsci" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>cogsci</span></a></span> <br><a href="https://aus.social/tags/VSA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VSA</span></a> <a href="https://aus.social/tags/HDC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HDC</span></a> <a href="https://aus.social/tags/CompCogSci" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CompCogSci</span></a> <a href="https://aus.social/tags/MathPsych" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MathPsych</span></a> <a href="https://aus.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://aus.social/tags/neuromorphic" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neuromorphic</span></a> <a href="https://aus.social/tags/neurosymbolic" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neurosymbolic</span></a> <a 
href="https://aus.social/tags/ComputationalNeuroscience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ComputationalNeuroscience</span></a> <a href="https://aus.social/tags/ComputationalCognitiveScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ComputationalCognitiveScience</span></a> <a href="https://aus.social/tags/MathematicalPsychology" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MathematicalPsychology</span></a></p>
Ross Gayler: Vector Symbolic Architecture / Hyperdimensional Computing workshop - Invited speaker announcement and call for posters
Ross Gayler<p>Here's a really interesting (long) paper on what a theory of computing based on arbitrary physical substrates might look like: <a href="http://arxiv.org/abs/2307.15408" rel="nofollow noopener" target="_blank"><span class="invisible">http://</span><span class="">arxiv.org/abs/2307.15408</span><span class="invisible"></span></a></p><p>"Toward a formal theory for computing machines made out of whatever physics offers: extended version"</p><p>Herbert Jaeger, Beatriz Noheda, Wilfred G. van der Wiel (2023)</p><p><span class="h-card"><a href="https://mas.to/@bnoheda" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>bnoheda</span></a></span> </p><p><a href="https://aus.social/tags/NewPaper" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>NewPaper</span></a> <a href="https://aus.social/tags/TheoreticalComputerScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TheoreticalComputerScience</span></a> <a href="https://aus.social/tags/neuromorphic" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neuromorphic</span></a> <a href="https://aus.social/tags/CogSci" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CogSci</span></a> <a href="https://aus.social/tags/CognitiveScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CognitiveScience</span></a> <a href="https://aus.social/tags/VSA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VSA</span></a> <a href="https://aus.social/tags/VectorSymbolicArchitecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VectorSymbolicArchitecture</span></a> <a href="https://aus.social/tags/HDC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HDC</span></a> <a href="https://aus.social/tags/HyperdimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperdimensionalComputing</span></a> <a 
href="https://aus.social/tags/AnalogComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AnalogComputing</span></a></p>
Ross Gayler<p>The videos of the talks from the Midnight Sun Workshop on Vector Symbolic Architectures and Hyperdimensional Computing are now online at:</p><p><a href="https://sites.google.com/ltu.se/midnightvsa/abstracts?authuser=0" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="ellipsis">sites.google.com/ltu.se/midnig</span><span class="invisible">htvsa/abstracts?authuser=0</span></a></p><p><a href="https://aus.social/tags/VSA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VSA</span></a> <a href="https://aus.social/tags/VectorSymbolicArchitecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VectorSymbolicArchitecture</span></a> <a href="https://aus.social/tags/HDC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HDC</span></a> <a href="https://aus.social/tags/HyperdimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperdimensionalComputing</span></a> <a href="https://aus.social/tags/CogSci" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CogSci</span></a> <a href="https://aus.social/tags/CognitiveScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CognitiveScience</span></a> <a href="https://aus.social/tags/neuromorphic" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neuromorphic</span></a> <a href="https://aus.social/tags/UnconventionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>UnconventionalComputing</span></a></p>
Ross Gayler<p>Thanks <span class="h-card"><a href="https://techhub.social/@hosford42" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>hosford42</span></a></span> for reminding me of this half-day tutorial on Vector Symbolic Architectures / Hyperdimensional Computing. The authors have been applying HDC/VSA to place recognition in robotics, but the tutorial coverage is much wider.</p><p><a href="https://www.tu-chemnitz.de/etit/proaut/workshops_tutorials/hdc_ki19/index.html" rel="nofollow noopener" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">tu-chemnitz.de/etit/proaut/wor</span><span class="invisible">kshops_tutorials/hdc_ki19/index.html</span></a></p><p><a href="https://aus.social/tags/VSA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VSA</span></a> <a href="https://aus.social/tags/VectorSymbolicArchitecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VectorSymbolicArchitecture</span></a> <a href="https://aus.social/tags/HDC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HDC</span></a> <a href="https://aus.social/tags/HyperdimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperdimensionalComputing</span></a> <a href="https://aus.social/tags/CogRob" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CogRob</span></a> <a href="https://aus.social/tags/CognitiveRobotics" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CognitiveRobotics</span></a> <a href="https://aus.social/tags/CompCogSci" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CompCogSci</span></a> <a href="https://aus.social/tags/ComputationalCognitiveScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ComputationalCognitiveScience</span></a> <a href="https://aus.social/tags/CogSci" class="mention hashtag" rel="nofollow noopener" 
target="_blank">#<span>CogSci</span></a> <a href="https://aus.social/tags/CognitiveScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CognitiveScience</span></a> <a href="https://aus.social/tags/MathPsych" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MathPsych</span></a> <a href="https://aus.social/tags/MathematicalPsychology" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MathematicalPsychology</span></a></p>
Ross Gayler<p><span class="h-card"><a href="https://techhub.social/@hosford42" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>hosford42</span></a></span> I'd be interested to see a description of what you've done for reinforcement learning with hyperdimensional computing.</p><p><a href="https://aus.social/tags/HDC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HDC</span></a> <a href="https://aus.social/tags/HyperdimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperdimensionalComputing</span></a> <a href="https://aus.social/tags/VSA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VSA</span></a> <a href="https://aus.social/tags/VectorSymbolicArchitecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VectorSymbolicArchitecture</span></a></p>
Aaron<p>Over the weekend I threw together some <a href="https://techhub.social/tags/Python" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Python</span></a> <a href="https://techhub.social/tags/TensorFlow" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TensorFlow</span></a> code to use <a href="https://techhub.social/tags/HyperDimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperDimensionalComputing</span></a> for <a href="https://techhub.social/tags/ReinforcementLearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ReinforcementLearning</span></a>. I was surprised to see my naive, first attempt blow away <a href="https://techhub.social/tags/DeepLearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DeepLearning</span></a>. I'm hooked.</p><p>Where have you been all my life, <a href="https://techhub.social/tags/HDC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HDC</span></a>?</p>
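Aaron doesn't share his code, but one common way to combine HDC with reinforcement learning (a hedged guess at the general shape, not his implementation, and using NumPy for brevity where the post mentions TensorFlow) is to encode discrete states as random hypervectors and learn one value vector per action with a TD update:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000                    # hypervector dimensionality
N_STATES, N_ACTIONS = 5, 2    # toy corridor; action 1 = right, 0 = left

# codebook: one random bipolar hypervector per discrete state
state_hv = rng.choice([-1.0, 1.0], size=(N_STATES, D))

# Q(s, a) is a dot product between the state's hypervector and a learned
# per-action value vector: linear TD learning over hyperdimensional features
q_vec = np.zeros((N_ACTIONS, D))

def q(s):
    return q_vec @ state_hv[s] / D

alpha, gamma = 0.1, 0.9
goal = N_STATES - 1

for episode in range(500):
    s = 0
    for _ in range(20):
        a = int(rng.integers(N_ACTIONS))      # random behaviour policy
        s2 = min(s + 1, goal) if a == 1 else max(s - 1, 0)
        r = 1.0 if s2 == goal else 0.0
        bootstrap = 0.0 if s2 == goal else gamma * np.max(q(s2))
        td = r + bootstrap - q(s)[a]          # Q-learning is off-policy
        q_vec[a] += alpha * td * state_hv[s]  # TD step along the hypervector
        s = s2
        if s == goal:
            break

greedy = [int(np.argmax(q(s))) for s in range(goal)]
print(greedy)  # the learned greedy policy should move right in every state
```

Because random bipolar hypervectors are nearly orthogonal, this behaves like tabular Q-learning while the update itself is just a vector add, which is part of what makes HDC attractive for cheap hardware.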
Aaron<p>Is anybody out there aware of, or better yet, familiar with, <a href="https://techhub.social/tags/HyperDimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperDimensionalComputing</span></a>? I'm eager to learn more about it.</p><p><a href="https://techhub.social/tags/MachineLearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MachineLearning</span></a></p>
Ross Gayler<p>High level piece on hyperdimensional computing / Vector Symbolic Architecture in Quanta Magazine:<br><a href="https://www.quantamagazine.org/a-new-approach-to-computation-reimagines-artificial-intelligence-20230413/" rel="nofollow noopener" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">quantamagazine.org/a-new-appro</span><span class="invisible">ach-to-computation-reimagines-artificial-intelligence-20230413/</span></a></p><p><a href="https://aus.social/tags/VSA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VSA</span></a> <a href="https://aus.social/tags/VectorSymbolicArchitecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VectorSymbolicArchitecture</span></a> <a href="https://aus.social/tags/HDC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HDC</span></a> <a href="https://aus.social/tags/HyperdimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperdimensionalComputing</span></a> <a href="https://aus.social/tags/CompCogSci" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CompCogSci</span></a> <a href="https://aus.social/tags/CogSci" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CogSci</span></a> <a href="https://aus.social/tags/CognitiveScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CognitiveScience</span></a></p>
Joaquín Herrero :wiki:<p>"<a href="https://scholar.social/tags/HyperdimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperdimensionalComputing</span></a>: Each piece of information is represented as a single entity, a hyperdimensional vector. An advantage of HC is transparency: The algebra clearly tells you why the system chose the answer it did. The same is not true for traditional neural networks. HC is well suited for low-power hardware. Despite such advantages, HC is still in its infancy but there’s real potential here."</p><p>A New Approach to Computation Reimagines Artificial Intelligence<br><a href="https://www.quantamagazine.org/a-new-approach-to-computation-reimagines-artificial-intelligence-20230413/" rel="nofollow noopener" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">quantamagazine.org/a-new-appro</span><span class="invisible">ach-to-computation-reimagines-artificial-intelligence-20230413/</span></a></p><p><a href="https://scholar.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a></p>
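The transparency claim in the quote above is easy to demonstrate: with bipolar hypervectors, binding is elementwise multiplication (its own inverse), so the algebra literally shows why a bundled record yields a given answer. A toy sketch (the names are mine, not from the article):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 10_000

def hv():
    # random bipolar hypervector
    return rng.choice([-1.0, 1.0], size=d)

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

COLOR, SHAPE, RED, CIRCLE = hv(), hv(), hv(), hv()

# one hypervector holds the whole record: bind each role to its filler
# (elementwise multiply), then bundle the pairs (add)
record = COLOR * RED + SHAPE * CIRCLE

# "why did you answer RED?" -- unbind with COLOR; binding is self-inverse,
# so record * COLOR = RED + noise, and the noise term is explicit algebra
query = record * COLOR
print(cos(query, RED))     # high similarity: the colour really is RED
print(cos(query, CIRCLE))  # near zero: CIRCLE plays no colour role
```

Every step of the answer is inspectable as a term in the expansion of `record * COLOR`, which is the contrast with opaque neural-network weights.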
Ross Gayler<p><span class="h-card"><a href="https://fediscience.org/@DrYohanJohn" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>DrYohanJohn</span></a></span> <br><span class="h-card"><a href="https://a.gup.pe/u/cogsci" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>cogsci</span></a></span> </p><p>Boosting for my followers - see original post in the thread above.</p><p>The BBS target article on the Language of Thought Hypothesis states:</p><p>"We outline six core properties of LoTs: (i) discrete constituents; (ii) role-filler independence; (iii) predicate-argument structure; (iv) logical operators; (v) inferential promiscuity; and (vi) abstract content."</p><p>It's interesting to think about the extent to which those properties are directly enabled by neural representations using Vector Symbolic Architectures / Hyperdimensional Computing. It hadn't occurred to me to draw a line between LoT and VSA/HDC.</p><p><a href="https://aus.social/tags/CogSci" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CogSci</span></a> <a href="https://aus.social/tags/CognitiveScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CognitiveScience</span></a> <a href="https://aus.social/tags/CompCogSci" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CompCogSci</span></a> <a href="https://aus.social/tags/ComputationalCognitiveScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ComputationalCognitiveScience</span></a> <a href="https://aus.social/tags/VSA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VSA</span></a> <a href="https://aus.social/tags/VectorSymbolicArchitecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VectorSymbolicArchitecture</span></a> <a href="https://aus.social/tags/HDC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HDC</span></a> <a 
href="https://aus.social/tags/HyperdimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperdimensionalComputing</span></a></p>
Ross Gayler: A blog post on basic Vector Symbolic Architecture / Hyperdimensional Computing operations implemented in Clojure
Ross Gayler<p><span class="h-card"><a href="https://fediscience.org/@matspike" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>matspike</span></a></span> Focusing on your final couple of points: linguistics needs structural representation &amp; PDEs (continuous maths models) offer nothing for structure - You might want to take a look at Hyperdimensional Computing (HDC) / Vector Symbolic Architectures (VSA), which are all about how to represent and manipulate composite structures (trees, graphs, etc.) by simple arithmetic on high dimensional vector spaces (as a model of, say, the firing rates of a large group of neurons). Mostly it's algebraic, but PDEs are optional if you wanted to model the dynamics of a VSA/HDC system.</p><p>Early VSA paper referring to the Jackendoff example you use: <a href="https://arxiv.org/abs/cs/0412059" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="">arxiv.org/abs/cs/0412059</span><span class="invisible"></span></a></p><p>Me droning on about VSA as analogue computing for discrete structures: <a href="https://archive.org/details/Redwood_Center_2013_06_14_Ross_Gayler" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="ellipsis">archive.org/details/Redwood_Ce</span><span class="invisible">nter_2013_06_14_Ross_Gayler</span></a></p><p>A recent survey of the topic: <a href="https://arxiv.org/abs/2111.06077" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="">arxiv.org/abs/2111.06077</span><span class="invisible"></span></a></p><p><a href="https://aus.social/tags/VSA" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VSA</span></a> <a href="https://aus.social/tags/HDC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HDC</span></a> <br><a href="https://aus.social/tags/VectorSymbolicArchitecture" class="mention hashtag" rel="nofollow noopener" 
target="_blank">#<span>VectorSymbolicArchitecture</span></a><br><a href="https://aus.social/tags/HyperdimensionalComputing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HyperdimensionalComputing</span></a></p>
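To give a concrete sense of "composite structure by simple arithmetic on high dimensional vector spaces", here is a toy sketch (my own illustration, not from the linked papers) that stores an ordered sequence in a single vector, using a cyclic-shift permutation to tag position, and then recovers the element at a chosen slot:

```python
import numpy as np

rng = np.random.default_rng(7)
d = 10_000

def hv():
    return rng.choice([-1.0, 1.0], size=d)  # random bipolar hypervector

def cos(x, y):
    return x @ y / (np.linalg.norm(x) * np.linalg.norm(y))

A, B, C = hv(), hv(), hv()

# encode the ordered sequence (A, B, C) as one vector: a cyclic shift
# (permutation) tags each position, and bundling superposes the items
seq = A + np.roll(B, 1) + np.roll(C, 2)

# to read position 1, undo that position's permutation and compare the
# result against the codebook of known items
probe = np.roll(seq, -1)
for name, v in [("A", A), ("B", B), ("C", C)]:
    print(name, round(cos(probe, v), 3))  # B stands out clearly
```

The same bind/bundle/permute primitives extend to trees and graphs, which is the sense in which these architectures do "analogue computing for discrete structures".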