lingo.lol is one of the many independent Mastodon servers you can use to participate in the fediverse.
A place for linguists, philologists, and other lovers of languages.

#ise2025

Harald Sack<p>The <a href="https://sigmoid.social/tags/ISE2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ISE2025</span></a> lecture is over. Today our students will write the final exam. Fingers crossed. Meanwhile - as our favorite coffee place is on vacation - Cappuccino and Zupfkuchen (highly recommended) together with the <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> TA team at Intro Café :) </p><p><a href="https://sigmoid.social/tags/coffeechallenge" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>coffeechallenge</span></a> <a href="https://sigmoid.social/tags/lecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lecture</span></a> <a href="https://sigmoid.social/tags/academiclife" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>academiclife</span></a> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@GenAsefa" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>GenAsefa</span></a></span> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://social.kit.edu/@KIT_Karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>KIT_Karlsruhe</span></a></span></p>
Harald Sack<p>In our <a href="https://sigmoid.social/tags/ISE2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ISE2025</span></a> lecture last week, we discussed what makes a node "important" in a knowledge graph. A simple heuristic can be borrowed from graph theory or communication theory: Degree Centrality</p><p>Interestingly, in Wikidata, in-degree centrality ranks Jane Austen as the most "important" female author, while out-degree centrality claims J.K. Rowling to be more "important" ;-) </p><p><a href="https://sigmoid.social/tags/knowledgegraphs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>knowledgegraphs</span></a> <a href="https://sigmoid.social/tags/semanticweb" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>semanticweb</span></a> <a href="https://sigmoid.social/tags/graphtheory" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>graphtheory</span></a> <a href="https://sigmoid.social/tags/feminism" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>feminism</span></a> <a href="https://sigmoid.social/tags/eyeofthebeholder" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>eyeofthebeholder</span></a> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span></p>
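The in-/out-degree contrast is easy to reproduce on a toy graph. A minimal sketch in plain Python (the triples below are invented for illustration, not taken from Wikidata):

```python
from collections import Counter

# Toy directed knowledge graph as (subject, predicate, object) triples.
# Edges run subject -> object; the data is made up for the example.
triples = [
    ("FanPage",    "about", "JaneAusten"),
    ("Biography",  "about", "JaneAusten"),
    ("JKRowling",  "wrote", "HarryPotter1"),
    ("JKRowling",  "wrote", "HarryPotter2"),
    ("JaneAusten", "wrote", "PrideAndPrejudice"),
]

def degree_centrality(triples):
    """In-degree counts incoming edges per node, out-degree outgoing ones."""
    in_deg = Counter(o for _, _, o in triples)
    out_deg = Counter(s for s, _, _ in triples)
    return in_deg, out_deg

in_deg, out_deg = degree_centrality(triples)
# Here JaneAusten leads on in-degree (2), JKRowling on out-degree (2):
# which author looks "important" depends on the direction you count.
```

The same asymmetry drives the Austen-vs-Rowling result: in-degree rewards being referenced, out-degree rewards having many outgoing statements.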
Harald Sack<p>One of our final topics in the <a href="https://sigmoid.social/tags/ISE2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ISE2025</span></a> lecture was Knowledge Graph Embeddings. How to vectorise KG structures while preserving their inherent semantics? </p><p><a href="https://sigmoid.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://sigmoid.social/tags/KGE" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>KGE</span></a> <a href="https://sigmoid.social/tags/knowledgegraphs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>knowledgegraphs</span></a> <a href="https://sigmoid.social/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a> <a href="https://sigmoid.social/tags/semanticweb" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>semanticweb</span></a> <a href="https://sigmoid.social/tags/lecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lecture</span></a> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" 
target="_blank">@<span>fiz_karlsruhe</span></a></span></p>
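One widely taught family of KG embeddings (used here as an example, not necessarily the lecture's exact material) is translation-based models such as TransE, which scores a triple (h, r, t) by how close h + r lands to t. A minimal sketch with made-up 3-dimensional embeddings:

```python
import math

def transe_score(h, r, t):
    """TransE plausibility: negative L2 distance ||h + r - t||.
    Scores closer to 0 mean the triple is more plausible."""
    return -math.sqrt(sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)))

# Hypothetical embeddings, chosen so that paris + capital_of == france.
paris      = [1.0, 0.0, 0.5]
capital_of = [0.0, 1.0, 0.0]
france     = [1.0, 1.0, 0.5]
berlin     = [4.0, 2.0, 3.0]

good = transe_score(paris, capital_of, france)
bad  = transe_score(paris, capital_of, berlin)
# (paris, capital_of, france) scores higher than (paris, capital_of, berlin):
# the geometry of the vector space encodes the relation.
```

In training, real models learn these vectors so that observed triples score high and corrupted ones score low; that is the sense in which the embedding "preserves semantics".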
Harald Sack<p>This week's ISE 2025 lecture was focussed on artificial neural networks. In particular, we discussed how to get rid of manual feature engineering and instead do representation learning from raw data with convolutional neural networks.</p><p><a href="https://sigmoid.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://sigmoid.social/tags/ArtificialNeuralNetworks" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ArtificialNeuralNetworks</span></a> <a href="https://sigmoid.social/tags/cnn" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>cnn</span></a> <a href="https://sigmoid.social/tags/deeplearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>deeplearning</span></a> <a href="https://sigmoid.social/tags/machinelearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>machinelearning</span></a> <a href="https://sigmoid.social/tags/ise2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ise2025</span></a> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://mastodon.social/@sarahjamielewis" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sarahjamielewis</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span></p>
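The building block that replaces hand-crafted features is the convolution filter, whose weights a CNN learns from data. A dependency-free sketch applying one fixed vertical-edge filter (the kind of feature a trained CNN typically discovers in its first layer):

```python
def conv2d(image, kernel):
    """'Valid'-mode 2-D convolution (cross-correlation, as in most DL libraries)."""
    kh, kw = len(kernel), len(kernel[0])
    rows = len(image) - kh + 1
    cols = len(image[0]) - kw + 1
    return [[sum(image[i + u][j + v] * kernel[u][v]
                 for u in range(kh) for v in range(kw))
             for j in range(cols)] for i in range(rows)]

# A 4x4 "image" with a hard dark/bright boundary down the middle.
image = [[0, 0, 1, 1]] * 4
edge_filter = [[-1, 1]]  # responds where brightness jumps left-to-right

feature_map = conv2d(image, edge_filter)
# Each output row reads [0, 1, 0]: the filter fires exactly on the edge.
```

Representation learning means the network adjusts such kernel weights by gradient descent instead of a human designing the edge detector.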
Harald Sack<p>In the ISE2025 lecture today, our students learned about unsupervised learning using the example of k-means clustering. One nice hands-on example is image colour reduction based on k-means clustering, as demonstrated in a colab notebook (based on the Python Data Science Handbook by Jake VanderPlas)</p><p>colab notebook: <a href="https://colab.research.google.com/drive/1lhdq2pynuwJKoXbspydECuWcPRw3-xxn?usp=sharing" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">colab.research.google.com/driv</span><span class="invisible">e/1lhdq2pynuwJKoXbspydECuWcPRw3-xxn?usp=sharing</span></a><br>Python Data Science Handbook: <a href="https://archive.org/details/python-data-science-handbook.pdf/mode/2up" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">archive.org/details/python-dat</span><span class="invisible">a-science-handbook.pdf/mode/2up</span></a></p><p><a href="https://sigmoid.social/tags/ise2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ise2025</span></a> <a href="https://sigmoid.social/tags/lecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lecture</span></a> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span> <a href="https://sigmoid.social/tags/datascience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>datascience</span></a> <a 
href="https://sigmoid.social/tags/machinelearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>machinelearning</span></a> <a href="https://sigmoid.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <span class="h-card" translate="no"><a href="https://social.kit.edu/@KIT_Karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>KIT_Karlsruhe</span></a></span></p>
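The colour-reduction trick in the notebook boils down to running k-means over pixel values and snapping each pixel to its nearest cluster centre. A dependency-free sketch of Lloyd's algorithm (with a naive "first k points" initialisation for determinism; the pixel values are made up):

```python
def kmeans(points, k, iters=10):
    """Plain Lloyd's algorithm over tuples (e.g. RGB pixels).
    Naive init: the first k points become centres (real code seeds randomly)."""
    centers = list(points[:k])
    for _ in range(iters):
        # Assignment step: each point joins its nearest centre's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        # Update step: each centre moves to its cluster's mean.
        centers = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl
                   else centers[i] for i, cl in enumerate(clusters)]
    return centers

# Pixels clustered around a red and a blue colour; k=2 recovers one
# representative colour per group, so the "image" needs only 2 colours.
pixels = [(250, 10, 10), (245, 5, 8), (10, 10, 240), (8, 12, 250)]
centers = kmeans(pixels, 2)
```

Replacing every pixel by its nearest centre then reduces the palette to k colours, which is exactly what the notebook does on a real image.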
Harald Sack<p>Tomorrow, we will dive deeper into ontologies with OWL, the Web Ontology Language. However, I have been giving OWL lectures for almost 20 years now - and neither OWL nor the lecture has changed much. So, I'm afraid I'm going to surprise/disappoint the students tomorrow, when I switch off the presentation and start improvising a random OWL ontology with them on the blackboard ;-) </p><p><a href="https://sigmoid.social/tags/ise2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ise2025</span></a> <a href="https://sigmoid.social/tags/OWL" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OWL</span></a> <a href="https://sigmoid.social/tags/semanticweb" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>semanticweb</span></a> <a href="https://sigmoid.social/tags/semweb" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>semweb</span></a> <a href="https://sigmoid.social/tags/RDF" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>RDF</span></a> <a href="https://sigmoid.social/tags/knowledgegraphs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>knowledgegraphs</span></a> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <span class="h-card" translate="no"><a 
href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span></p>
Sarven Capadisli<p><span class="h-card" translate="no"><a href="https://sigmoid.social/@lysander07" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>lysander07</span></a></span> What I like about this view is that the info is not necessarily complete. Typical real-world data. So, some queries will not show everything there is to know about a topic, while a lot of info may be about other things. Though it can be extended - "pay as you go".</p><p>Some students may also be interested in expressing their research output as a <a href="https://w3c.social/tags/KnowledgeGraph" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>KnowledgeGraph</span></a> in its own right - playing an important role in <a href="https://w3c.social/tags/ScholarlyCommunication" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ScholarlyCommunication</span></a></p><p>eg view: <a href="https://dokie.li/?graph=https://csarven.ca/linked-research-decentralised-web" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">dokie.li/?graph=https://csarve</span><span class="invisible">n.ca/linked-research-decentralised-web</span></a></p><p>seeAlso alt text.</p><p><a href="https://w3c.social/tags/ISE2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ISE2025</span></a></p>
Harald Sack<p>In today's ISE 2025 lecture, we will introduce SPARQL as a query language for knowledge graphs. Again, I'm trying out 'Dystopian Novels' as an example knowledge graph playground. Let's see if the students know any of them. What do you think? ;-) </p><p><a href="https://sigmoid.social/tags/dystopia" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>dystopia</span></a> <a href="https://sigmoid.social/tags/literature" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>literature</span></a> <a href="https://sigmoid.social/tags/ise2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ise2025</span></a> <a href="https://sigmoid.social/tags/semanticweb" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>semanticweb</span></a> <a href="https://sigmoid.social/tags/semweb" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>semweb</span></a> <a href="https://sigmoid.social/tags/knowledgegraphs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>knowledgegraphs</span></a> <a href="https://sigmoid.social/tags/sparql" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>sparql</span></a> <a href="https://sigmoid.social/tags/lecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lecture</span></a> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span></p>
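At its core, a SPARQL basic graph pattern matches triple patterns containing variables against the graph. A toy matcher in plain Python (not real SPARQL; the dystopian-novel triples are invented for the example):

```python
# Roughly the job of: SELECT ?book WHERE { ?book :genre :Dystopia }
graph = [
    ("1984",          "author", "GeorgeOrwell"),
    ("BraveNewWorld", "author", "AldousHuxley"),
    ("1984",          "genre",  "Dystopia"),
    ("BraveNewWorld", "genre",  "Dystopia"),
]

def match(pattern, graph):
    """Yield one variable binding per triple matching the pattern.
    Components starting with '?' are variables; others must match exactly."""
    for triple in graph:
        binding = {}
        for p, v in zip(pattern, triple):
            if p.startswith("?"):
                binding[p] = v
            elif p != v:
                break
        else:
            yield binding

books = [b["?book"] for b in match(("?book", "genre", "Dystopia"), graph)]
```

A real SPARQL engine additionally joins bindings across multiple patterns, but the variable-binding idea is the same.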
Harald Sack<p>Back in the lecture hall again after two exciting weeks of <a href="https://sigmoid.social/tags/ESWC2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ESWC2025</span></a> and <a href="https://sigmoid.social/tags/ISWS2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ISWS2025</span></a>. This morning, we introduced our students to RDF, RDFS, RDF Inferencing, and RDF Reification.</p><p><a href="https://sigmoid.social/tags/ise2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ise2025</span></a> <a href="https://sigmoid.social/tags/semanticweb" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>semanticweb</span></a> <a href="https://sigmoid.social/tags/semweb" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>semweb</span></a> <a href="https://sigmoid.social/tags/knowledgegraphs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>knowledgegraphs</span></a> <a href="https://sigmoid.social/tags/rdf" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>rdf</span></a> <a href="https://sigmoid.social/tags/reasoning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>reasoning</span></a> <a href="https://sigmoid.social/tags/reification" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>reification</span></a> <a href="https://sigmoid.social/tags/lecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lecture</span></a> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a 
href="https://mastodon.social/@KIT_Karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>KIT_Karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span></p>
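RDFS inferencing can be demonstrated with a few lines of forward chaining. A sketch of two RDFS entailment rules (rdfs9, type propagation along subclass links, and rdfs11, subclass transitivity) over invented triples:

```python
def rdfs_closure(triples):
    """Forward-chain rdfs9 (x type C, C subClassOf D => x type D) and
    rdfs11 (transitivity of rdfs:subClassOf) until a fixpoint is reached."""
    inferred = set(triples)
    while True:
        new = {(x, "rdf:type", d)
               for (c, p, d) in inferred if p == "rdfs:subClassOf"
               for (x, q, cc) in inferred if q == "rdf:type" and cc == c}
        new |= {(c, "rdfs:subClassOf", e)
                for (c, p, d) in inferred if p == "rdfs:subClassOf"
                for (dd, q, e) in inferred if q == "rdfs:subClassOf" and dd == d}
        if new <= inferred:          # nothing new entailed: done
            return inferred
        inferred |= new

triples = {
    ("DystopianNovel",     "rdfs:subClassOf", "Novel"),
    ("Novel",              "rdfs:subClassOf", "Book"),
    ("NineteenEightyFour", "rdf:type",        "DystopianNovel"),
}
closure = rdfs_closure(triples)
# Entailed: NineteenEightyFour is also a Novel and a Book.
```

Production reasoners implement many more entailment rules, but the fixpoint structure is the same.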
Harald Sack<p>Last week, we continued our <a href="https://sigmoid.social/tags/ISE2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ISE2025</span></a> lecture on distributional semantics with the introduction of neural language models (NLMs) and compared them to traditional statistical n-gram models. <br>Benefits of NLMs:<br>- Capturing Long-Range Dependencies<br>- Computational and Statistical Tractability<br>- Improved Generalisation<br>- Higher Accuracy</p><p><span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span> <a href="https://sigmoid.social/tags/llms" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>llms</span></a> <a href="https://sigmoid.social/tags/nlp" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>nlp</span></a> <a href="https://sigmoid.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://sigmoid.social/tags/lecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lecture</span></a></p>
Harald Sack<p>In the <a href="https://sigmoid.social/tags/ISE2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ISE2025</span></a> lecture today we introduced our students to the concept of distributional semantics as the foundation of modern large language models. Historically, Wittgenstein was one of the important figures in the Philosophy of Language, stating that "The meaning of a word is its use in the language."</p><p><a href="https://static1.squarespace.com/static/54889e73e4b0a2c1f9891289/t/564b61a4e4b04eca59c4d232/1447780772744/Ludwig.Wittgenstein.-.Philosophical.Investigations.pdf" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">static1.squarespace.com/static</span><span class="invisible">/54889e73e4b0a2c1f9891289/t/564b61a4e4b04eca59c4d232/1447780772744/Ludwig.Wittgenstein.-.Philosophical.Investigations.pdf</span></a></p><p><a href="https://sigmoid.social/tags/philosophy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>philosophy</span></a> <a href="https://sigmoid.social/tags/wittgenstein" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>wittgenstein</span></a> <a href="https://sigmoid.social/tags/nlp" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>nlp</span></a> <a href="https://sigmoid.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://sigmoid.social/tags/llm" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>llm</span></a> <a href="https://sigmoid.social/tags/languagemodel" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>languagemodel</span></a> <a href="https://sigmoid.social/tags/language" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>language</span></a> <a href="https://sigmoid.social/tags/lecture" class="mention hashtag" rel="nofollow noopener" 
target="_blank">#<span>lecture</span></a> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <a href="https://sigmoid.social/tags/AIart" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIart</span></a></p>
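"Meaning is use" has a direct computational reading: represent a word by the contexts it occurs in, and call words similar when their contexts are similar. A tiny co-occurrence sketch in plain Python (the four-sentence corpus is invented):

```python
from collections import Counter
from math import sqrt

corpus = [
    "the cat drinks milk", "the dog drinks water",
    "the cat chases the dog", "the dog chases the cat",
]

def context_vector(word, sentences):
    """Bag of co-occurring words: a crude distributional representation."""
    ctx = Counter()
    for s in sentences:
        tokens = s.split()
        if word in tokens:
            ctx.update(t for t in tokens if t != word)
    return ctx

def cosine(u, v):
    dot = sum(u[w] * v[w] for w in u)
    return dot / (sqrt(sum(c * c for c in u.values())) *
                  sqrt(sum(c * c for c in v.values())))

cat, dog, milk = (context_vector(w, corpus) for w in ("cat", "dog", "milk"))
# "cat" and "dog" occur in similar contexts, so they come out more similar
# to each other than "cat" is to "milk".
```

Word embeddings and, ultimately, LLM representations refine exactly this idea with learned dense vectors instead of raw counts.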
Harald Sack<p>Generating Shakespeare-like text with an n-gram language model is straightforward and quite simple. But don't expect too much of it. It will not be able to recreate a lost Shakespeare play for you ;-) It's merely a parrot, making up well-sounding sentences out of fragments of original Shakespeare texts...</p><p><a href="https://sigmoid.social/tags/ise2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ise2025</span></a> <a href="https://sigmoid.social/tags/lecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lecture</span></a> <a href="https://sigmoid.social/tags/nlp" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>nlp</span></a> <a href="https://sigmoid.social/tags/llm" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>llm</span></a> <a href="https://sigmoid.social/tags/languagemodel" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>languagemodel</span></a> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <a href="https://sigmoid.social/tags/shakespeare" class="mention hashtag" rel="nofollow noopener" 
target="_blank">#<span>shakespeare</span></a> <a href="https://sigmoid.social/tags/generativeAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>generativeAI</span></a> <a href="https://sigmoid.social/tags/statistics" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>statistics</span></a></p>
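The "parrot" behaviour is easy to see in code: a bigram model can only chain together word pairs it has actually observed. A minimal generator, trained on a short Hamlet fragment (any corpus would do):

```python
import random
from collections import defaultdict

# Train a bigram table on a tiny fragment; real models use whole corpora.
text = ("to be or not to be that is the question "
        "to die to sleep to sleep perchance to dream").split()

bigrams = defaultdict(list)
for w1, w2 in zip(text, text[1:]):
    bigrams[w1].append(w2)

def babble(start, length, seed=0):
    """Walk the bigram table: every step reuses an observed word pair."""
    rng = random.Random(seed)
    words = [start]
    while len(words) < length and bigrams[words[-1]]:
        words.append(rng.choice(bigrams[words[-1]]))
    return " ".join(words)

sentence = babble("to", 8)
```

Every adjacent word pair in the output is guaranteed to come from the training text, which is exactly why it recombines fragments rather than inventing a lost play.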
Harald Sack<p>In our <a href="https://sigmoid.social/tags/ISE2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ISE2025</span></a> lecture last Wednesday, we learned how n-gram language models use the Markov assumption and maximum likelihood estimation to predict the probability of the occurrence of a word given a specific context (i.e. the preceding n-1 words in the sequence).</p><p><a href="https://sigmoid.social/tags/NLP" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>NLP</span></a> <a href="https://sigmoid.social/tags/languagemodels" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>languagemodels</span></a> <a href="https://sigmoid.social/tags/lecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lecture</span></a> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> @tabea <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://mastodon.social/@KIT_Karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>KIT_Karlsruhe</span></a></span></p>
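Concretely, with the Markov assumption the maximum likelihood estimate reduces to counting: P(w | prev) = count(prev, w) / count(prev). A bigram (n = 2) sketch over an invented toy corpus:

```python
from collections import Counter

# Bigram model: condition each word on the single previous word.
tokens = "the cat sat on the mat the cat ran".split()

pair_counts = Counter(zip(tokens, tokens[1:]))
context_counts = Counter(tokens[:-1])  # how often each word acts as context

def prob(word, prev):
    """Maximum likelihood estimate of P(word | prev)."""
    return pair_counts[(prev, word)] / context_counts[prev]

# "the" is followed by "cat" 2 times out of 3, so P(cat | the) = 2/3.
```

For larger n the context key simply becomes a tuple of the preceding n-1 words; real systems add smoothing so unseen n-grams don't get probability zero.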
Harald Sack<p>This week, we discussed the central question - can we "predict" a word? - as the basis for statistical language models in our <a href="https://sigmoid.social/tags/ISE2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ISE2025</span></a> lecture. Of course, I was trying Shakespeare quotes to motivate the (international) students to complete the quotes with the "predicted" missing words ;-)</p><p>"All the world's a stage, and all the men and women merely...."</p><p><a href="https://sigmoid.social/tags/nlp" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>nlp</span></a> <a href="https://sigmoid.social/tags/llms" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>llms</span></a> <a href="https://sigmoid.social/tags/languagemodel" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>languagemodel</span></a> <a href="https://sigmoid.social/tags/Shakespeare" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Shakespeare</span></a> <a href="https://sigmoid.social/tags/AIart" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIart</span></a> lecture <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url 
mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <a href="https://sigmoid.social/tags/brushUpYourShakespeare" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>brushUpYourShakespeare</span></a></p>
Harald Sack<p>Last week, our students learned how to conduct a proper evaluation for an NLP experiment. To this end, we introduced a small text corpus with sentences about Joseph Fourier, who counts as one of the discoverers of the greenhouse effect, which is responsible for global warming.</p><p><a href="https://github.com/ISE-FIZKarlsruhe/ISE-teaching/blob/b72690d38911b37748082256b61f96cf86171ace/materials/dataset/fouriercorpus.txt" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">github.com/ISE-FIZKarlsruhe/IS</span><span class="invisible">E-teaching/blob/b72690d38911b37748082256b61f96cf86171ace/materials/dataset/fouriercorpus.txt</span></a></p><p><a href="https://sigmoid.social/tags/ise2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ise2025</span></a> <a href="https://sigmoid.social/tags/nlp" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>nlp</span></a> <a href="https://sigmoid.social/tags/lecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lecture</span></a> <a href="https://sigmoid.social/tags/climatechange" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>climatechange</span></a> <a href="https://sigmoid.social/tags/globalwarming" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>globalwarming</span></a> <a href="https://sigmoid.social/tags/historyofscience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>historyofscience</span></a> <a href="https://sigmoid.social/tags/climate" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>climate</span></a> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" 
rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span></p>
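The standard set-based scores for such an experiment (e.g. extracted entity mentions compared against a gold annotation) look like this; the Fourier-themed mentions below are invented for illustration:

```python
def precision_recall_f1(gold, predicted):
    """Set-based evaluation as commonly used for NLP extraction tasks."""
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)                       # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical mentions from a sentence about Joseph Fourier.
gold = {"Joseph Fourier", "greenhouse effect", "global warming"}
pred = {"Joseph Fourier", "global warming", "France"}
p, r, f1 = precision_recall_f1(gold, pred)
# Two of three predictions are correct, two of three gold items found.
```

F1 is the harmonic mean of precision and recall, so a system cannot game the score by over- or under-predicting.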
Harald Sack<p>The last leg of our brief history of NLP (so far) is the advent of large language models with GPT-3 in 2020 and the introduction of learning from the prompt (aka few-shot learning).</p><p>T. B. Brown et al. (2020). Language models are few-shot learners. NeurIPS'20</p><p><a href="https://proceedings.neurips.cc/paper/2020/file/1457c0d6bfcb4967418bfb8ac142f64a-Paper.pdf" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">proceedings.neurips.cc/paper/2</span><span class="invisible">020/file/1457c0d6bfcb4967418bfb8ac142f64a-Paper.pdf</span></a></p><p><a href="https://sigmoid.social/tags/llms" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>llms</span></a> <a href="https://sigmoid.social/tags/gpt" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>gpt</span></a> <a href="https://sigmoid.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://sigmoid.social/tags/nlp" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>nlp</span></a> <a href="https://sigmoid.social/tags/historyofscience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>historyofscience</span></a> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span> <span 
class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <a href="https://sigmoid.social/tags/ise2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ise2025</span></a></p>
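To make "learning from the prompt" concrete (an editorial sketch, not from the original post): in few-shot prompting the task is specified entirely through examples embedded in the input text, with no weight updates. The translation pairs below are toy data in the spirit of the GPT-3 paper's figures:

```python
# Sketch of few-shot "learning from the prompt": the task is conveyed
# purely via in-context examples; no model parameters are updated.
# The translation pairs are illustrative toy data.

examples = [
    ("sea otter", "loutre de mer"),
    ("cheese", "fromage"),
]

def few_shot_prompt(examples, query):
    """Assemble a few-shot prompt: task description, demonstrations, query."""
    lines = ["Translate English to French:"]
    for en, fr in examples:
        lines.append(f"{en} => {fr}")
    lines.append(f"{query} =>")  # the model is expected to continue from here
    return "\n".join(lines)

print(few_shot_prompt(examples, "otter"))
```

The resulting string would be sent verbatim to a language model, which completes the final line by analogy with the demonstrations.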
Harald Sack<p>Next stop on our NLP timeline is 2013 and the introduction of low-dimensional dense word vectors - so-called "word embeddings" - based on distributional semantics, e.g. word2vec by Mikolov et al. at Google, which enabled representation learning on text.</p><p>T. Mikolov et al. (2013). Efficient Estimation of Word Representations in Vector Space. <br><a href="https://arxiv.org/abs/1301.3781" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">arxiv.org/abs/1301.3781</span><span class="invisible"></span></a></p><p><a href="https://sigmoid.social/tags/NLP" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>NLP</span></a> <a href="https://sigmoid.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://sigmoid.social/tags/wordembeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>wordembeddings</span></a> <a href="https://sigmoid.social/tags/word2vec" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>word2vec</span></a> <a href="https://sigmoid.social/tags/ise2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ise2025</span></a> <a href="https://sigmoid.social/tags/historyofscience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>historyofscience</span></a> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a 
href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span></p>
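A tiny illustration of the idea behind dense word vectors (an editorial sketch; the 3-dimensional vectors are made up for demonstration, whereas real word2vec embeddings typically have 100-300 dimensions): semantically related words get vectors that point in similar directions, measured by cosine similarity.

```python
# Toy word vectors and cosine similarity: related words ("king"/"queen")
# score higher than unrelated ones ("king"/"apple").
# All vector values below are invented for illustration.
import math

vectors = {
    "king":  [0.8, 0.65, 0.1],
    "queen": [0.75, 0.7, 0.15],
    "apple": [0.1, 0.05, 0.9],
}

def cosine(u, v):
    """Cosine of the angle between two vectors: dot(u, v) / (|u| * |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

print(cosine(vectors["king"], vectors["queen"]))  # close to 1
print(cosine(vectors["king"], vectors["apple"]))  # much smaller
```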
Harald Sack<p>Beginning in the 1990s, statistical n-gram language models, trained on vast text collections, became the backbone of NLP research. They fueled advances in nearly all NLP techniques of the era, laying the groundwork for today's AI. </p><p>F. Jelinek (1997), Statistical Methods for Speech Recognition, MIT Press, Cambridge, MA</p><p><a href="https://sigmoid.social/tags/NLP" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>NLP</span></a> <a href="https://sigmoid.social/tags/LanguageModels" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LanguageModels</span></a> <a href="https://sigmoid.social/tags/HistoryOfAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HistoryOfAI</span></a> <a href="https://sigmoid.social/tags/TextProcessing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextProcessing</span></a> <a href="https://sigmoid.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://sigmoid.social/tags/historyofscience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>historyofscience</span></a> <a href="https://sigmoid.social/tags/ISE2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ISE2025</span></a> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" 
target="_blank">@<span>enorouzi</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span></p>
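The core of an n-gram language model fits in a few lines (an editorial sketch with maximum-likelihood estimates and a made-up miniature corpus; real systems of the era added smoothing to handle unseen n-grams):

```python
# Minimal bigram language model: P(word | prev) estimated by
# maximum likelihood from bigram and unigram counts.
# The tiny training corpus is illustrative only.
from collections import Counter

corpus = "the cat sat on the mat the cat ate".split()

bigrams = Counter(zip(corpus, corpus[1:]))  # counts of adjacent word pairs
unigrams = Counter(corpus[:-1])             # counts of history words

def p(word, prev):
    """P(word | prev) = count(prev, word) / count(prev)."""
    return bigrams[(prev, word)] / unigrams[prev] if unigrams[prev] else 0.0

print(p("cat", "the"))  # 2 of the 3 occurrences of "the" are followed by "cat"
```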
Harald Sack<p>Next stop on our NLP timeline (as part of the <a href="https://sigmoid.social/tags/ISE2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ISE2025</span></a> lecture) was Terry Winograd's SHRDLU, an early natural language understanding system developed in 1968-70 that could manipulate blocks in a virtual world. </p><p>Winograd, T. Procedures as a Representation for Data in a Computer Program for Understanding Natural Language. MIT AI Technical Report 235.<br><a href="http://dspace.mit.edu/bitstream/handle/1721.1/7095/AITR-235.pdf" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">dspace.mit.edu/bitstream/handl</span><span class="invisible">e/1721.1/7095/AITR-235.pdf</span></a></p><p><a href="https://sigmoid.social/tags/nlp" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>nlp</span></a> <a href="https://sigmoid.social/tags/lecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lecture</span></a> <a href="https://sigmoid.social/tags/historyofscience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>historyofscience</span></a> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span> <span class="h-card" translate="no"><a 
href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span> <a href="https://sigmoid.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a></p>
Harald Sack<p>With the advent of ELIZA, Joseph Weizenbaum's pioneering psychotherapist chatbot, NLP took another major step with pattern-based substitution algorithms built on simple regular expressions.</p><p>Weizenbaum, Joseph (1966). ELIZA—a computer program for the study of natural language communication between man and machine. Communications of the ACM. 9: 36–45.</p><p><a href="https://dl.acm.org/doi/pdf/10.1145/365153.365168" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">dl.acm.org/doi/pdf/10.1145/365</span><span class="invisible">153.365168</span></a></p><p><a href="https://sigmoid.social/tags/nlp" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>nlp</span></a> <a href="https://sigmoid.social/tags/lecture" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lecture</span></a> <a href="https://sigmoid.social/tags/chatbot" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>chatbot</span></a> <a href="https://sigmoid.social/tags/llm" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>llm</span></a> <a href="https://sigmoid.social/tags/ise2025" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ise2025</span></a> <a href="https://sigmoid.social/tags/historyofScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>historyofScience</span></a> <a href="https://sigmoid.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <span class="h-card" translate="no"><a href="https://sigmoid.social/@fizise" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fizise</span></a></span> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <span class="h-card" translate="no"><a 
href="https://fedihum.org/@tabea" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>tabea</span></a></span> <span class="h-card" translate="no"><a href="https://sigmoid.social/@enorouzi" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>enorouzi</span></a></span> <span class="h-card" translate="no"><a href="https://fedihum.org/@sourisnumerique" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>sourisnumerique</span></a></span></p>
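ELIZA's pattern-based substitution is easy to sketch in a few lines (an editorial illustration in the spirit of Weizenbaum's 1966 program; the rules below are invented, not his original DOCTOR script):

```python
# ELIZA-style sketch: match a pattern in the user's utterance and
# substitute the captured text into a canned response template.
# The two rules are illustrative, not Weizenbaum's original script.
import re

rules = [
    (re.compile(r"\bI am (.*)", re.IGNORECASE), r"Why do you say you are \1?"),
    (re.compile(r"\bI feel (.*)", re.IGNORECASE), r"Why do you feel \1?"),
]

def respond(utterance: str) -> str:
    for pattern, template in rules:
        match = pattern.search(utterance)
        if match:
            # expand() substitutes \1 with the captured group
            return match.expand(template)
    return "Please tell me more."  # fallback when no rule matches

print(respond("I am sad about the weather"))
# Why do you say you are sad about the weather?
```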