<p>Just had an interesting (but a bit unsettling) conversation with <a href="https://mas.to/tags/AI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AI</span></a> about what it thought its future abilities would be. It’s a long response but worth the read.</p><p>My question: What new <a href="https://mas.to/tags/emergent" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>emergent</span></a> <a href="https://mas.to/tags/abilities" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>abilities</span></a> will occur when <a href="https://mas.to/tags/parameter" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>parameter</span></a> <a href="https://mas.to/tags/numbers" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>numbers</span></a> can be increased by orders of <a href="https://mas.to/tags/magnitude" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>magnitude</span></a>?</p><p>AI’s response: This is a fascinating and actively researched question in AI! As language models like GPT grow in parameter count by orders of magnitude (e.g., from billions to tens or hundreds of trillions),</p><p>1/8</p>