they did ban it, and everyone still used it (Telegram was good at evading the bans back then, but eventually Roskomnadzor became decent at banning it), and then they unbanned it, whatever that means
I use sway on my phone, had to add a secondary menu bar with a few keys for stuff like opening rofi, but it works perfectly fine otherwise
Russia banned Telegram, everyone (incl. the government) continued to use it, Russia unbanned Telegram - that’s how it looks from here. While it was still banned, a government official told me unbanning it was just a matter of time.
it receives relatively frequent updates, and it uses love2d (with a native lua module for the AI) so it’s cross-platform.
the code is FOSS, the weights aren’t. This is pretty common with e.g. FOSS games; the only difference here is that weights are much costlier to remake from scratch than game assets
again, !bang is for searching using a specific search engine, !!bang is for redirecting to a search engine’s page
!g will search with google
!!g will redirect to google
all ddg bangs are supported to my knowledge, but obviously !bang will only work with the search engines searxng supports
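to make the two prefixes concrete, here’s a minimal sketch of how a client could build the corresponding searxng query URLs (the instance hostname `searx.example.org` and the `searx_query` helper are hypothetical, just for illustration; searxng itself parses the `!`/`!!` prefix out of the `q` parameter):

```python
from urllib.parse import urlencode

# Hypothetical instance URL - replace with your own searxng instance
BASE = "https://searx.example.org/search"

def searx_query(query: str, bang: str = "", redirect: bool = False) -> str:
    """Build a searxng search URL.

    '!bang'  -> search with that engine, results shown inside searxng
    '!!bang' -> redirect straight to the engine's own results page
    """
    prefix = (("!!" if redirect else "!") + bang + " ") if bang else ""
    return BASE + "?" + urlencode({"q": prefix + query})

# search with google inside searxng
print(searx_query("linux kernel", bang="g"))
# redirect to google's results page
print(searx_query("linux kernel", bang="g", redirect=True))
```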
different neural network types excel at different tasks - image recognition was invented way before LLMs, not just because of a lack of processing power, but also because the earlier architectures didn’t work with language. New architectures don’t appear out of thin air, they are created with a rough idea of what we could need to make the network do a certain task (e.g. NLP) better. Even tokenization isn’t blind codepoint separation but is based on an analysis of languages. But yes, natural languages aren’t “parsed” for neural networks, they don’t even have a formal grammar.
i’m not talking about knowing about how humans perceive/learn languages, i’m talking about language structure. Perhaps it’s wrong to call it “how languages work”
While I agree that LLMs can achieve human-tier efficiency at most tasks eventually (some architectural changes will be necessary, but the core approach seems sound), it’s wrong to say it’s modeled after the human brain. We have no idea how brains work as they’re super complex, we’re building artificial neural networks from the ground up. AI uses centuries’ worth of math, but with our current maths knowledge the code isn’t too complicated. Human brains aren’t like that, they can’t be summed up in a few lines of code because DNA is a huge mess that contains so much more than just “learning”, so many inactive or redundant bits and pieces. We’re building LLMs with knowledge of how languages work, not how brains work.
searxng has bangs too
!bang to search using a specific engine, !!bang to redirect to a search engine’s page
this kind of software is mostly used for tech support, so your option is too hard to set up
because killing birds isn’t a task of the kernel, it’s the task of a userspace utility that’s part of the coreutils
over a century ago Lenin defined imperialism as capitalism in decay, monopoly capitalism, capitalism that has outgrown competition, that has stopped playing a progressive role in history and become solely a force of reaction, and since then not much has changed
they are, the titles just got changed
I’m a programmer and I remember 33 digits of pi, but in practice I never use it because I never have to deal with geometry
ha ha women objectification funny
it’s probably caused by fast shutdown
I remember them responding to a couple of antipiracy lawsuits in… India I think? they also make an exception for ISIS-related channels. But mostly all, yes.