lingo.lol is one of the many independent Mastodon servers you can use to participate in the fediverse.
A place for linguists, philologists, and other lovers of languages.

#ICLR2023


How many of those who want to create "AGI benefiting ALL of humanity" and are concerned about "existential risks" were at #ICLR2023 in Kigali? We know they show up to the conference rotations when they're in their usual locations, with visas barring most people in the world. But I am blabbering on about the global apartheid system of visas that restricts the movement of non-white people. Genius minds like Hinton & others can't be bothered with such minuscule issues: they're thinking about HUMANITY!

I'm presenting a poster in the Sparse Neural Network workshop @sparsenn at #ICLR2023 on "Efficient Real Time Recurrent Learning through combined activity and parameter sparsity". Come by if you're around!

Link to paper: arxiv.org/abs/2303.05641

arXiv.org: Efficient Real Time Recurrent Learning through combined activity and parameter sparsity

Backpropagation through time (BPTT) is the standard algorithm for training recurrent neural networks (RNNs), which requires separate simulation phases for the forward and backward passes for inference and learning, respectively. Moreover, BPTT requires storing the complete history of network states between phases, with memory consumption growing proportional to the input sequence length. This makes BPTT unsuited for online learning and presents a challenge for implementation on low-resource real-time systems. Real-Time Recurrent Learning (RTRL) allows online learning, and the growth of required memory is independent of sequence length. However, RTRL suffers from exceptionally high computational costs that grow proportional to the fourth power of the state size, making RTRL computationally intractable for all but the smallest of networks. In this work, we show that recurrent networks exhibiting high activity sparsity can reduce the computational cost of RTRL. Moreover, combining activity and parameter sparsity can lead to significant enough savings in computational and memory costs to make RTRL practical. Unlike previous work, this improvement in the efficiency of RTRL can be achieved without using any approximations for the learning process.
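The tradeoff in that abstract is easy to make concrete with numbers. Below is a back-of-the-envelope sketch in Python; this is my own illustration, not code or figures from the paper, and the formulas are the standard dense-RNN asymptotics with constants and lower-order terms dropped.

# Asymptotic training costs for a dense RNN with state size n on a
# sequence of length T. Illustrative sketch only: constants omitted,
# textbook asymptotics rather than the paper's measured costs.

def bptt_costs(n: int, T: int) -> tuple[int, int]:
    """BPTT: memory grows with sequence length (stored state history)."""
    memory = n * T       # one hidden state saved per timestep
    compute = n**2 * T   # dense matrix-vector products, forward + backward
    return memory, compute

def rtrl_costs(n: int, T: int) -> tuple[int, int]:
    """RTRL: memory is independent of T, but the n x n^2 influence
    matrix makes the per-step update scale as n^4."""
    memory = n**3        # sensitivity of n states w.r.t. ~n^2 parameters
    compute = n**4 * T   # updating the influence matrix at every step
    return memory, compute

if __name__ == "__main__":
    n = 256
    for T in (1_000, 100_000):
        bm, bc = bptt_costs(n, T)
        rm, rc = rtrl_costs(n, T)
        print(f"T={T:>7}: BPTT mem={bm:.1e} flops={bc:.1e} | "
              f"RTRL mem={rm:.1e} flops={rc:.1e}")

Running it shows exactly the tension the abstract describes: RTRL's memory does not move as T grows while BPTT's does, and the price is RTRL's n**4 compute term, which is what combining activity and parameter sparsity is meant to cut down.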

Join Jade Abbott (@alienelf), our Lelapa AI @LelapaAI co-founder, at the Practical Machine Learning for Developing Countries workshop today.

27/n

---
RT @LacunaFund
TODAY! If you are at #ICLR2023, go check out these Lacuna Fund grantees presenting their work.

AfricaNLP - @asmelashteka, @Shmuhammadd

Machine Learning for Remote Sensing - @j_nabende

Practical Machine Learning for Developing Countries - @RKiire, @alienelf, @asmelashteka
twitter.com/LacunaFund/status/


We also had Kathleen Siminyu @siminyu_kat - "NLP for African Languages"

IndabaX Rwanda at #ICLR2023 is going deep into policy, regulation, and protecting people as part of our AI research. We are also very proud to have Kathleen as a new member of @DSFSI_Research at @UPTuks. 23/n


This morning, I had the opportunity to deliver a keynote speech at IndabaX Rwanda, which was held concurrently with #ICLR2023. The topic of my speech was AI + Policy for African Researchers, and I focused on the importance of addressing policy and trust. 21/n


As I said earlier, @DSFSI_Research is here at the #ICLR2023 conference. Read about our contributions below.

---
RT @DSFSI_Research
The lab is at the #ICLR2023 conference in Kigali. This is the first time a major AI conference has been held on the African continent. A major historical moment.

Read more about our contributions: DSFSI @ ICLR 2023 - KIGALI, RWANDA: 1-5 MAY 2023
dsfsi.github.io/blog/dsfsi-icl
twitter.com/DSFSI_Research/sta


Personal note. The first PML4DC and AfricaNLP workshops were held at #ICLR2020. It was supposed to be the first ICLR held on the African continent, but due to COVID it moved completely online. As such, the workshops being at the physical #ICLR2023 is somewhat bittersweet. 17/n