Someone else already pointed out that you confused different women for the same person, but the man is different too. The top picture in OP’s post is Donald Trump, the USA guy. In a closer photo he looks completely different to Putin.
Aria@lemmygrad.ml to Technology@lemmy.ml • Apple calls for changes to anti-monopoly laws and says it may stop shipping to the EU · 5 · 3 days ago
> demands for interoperability with non-Apple products
> the rules were not applied to Samsung
I wonder if Samsung owns a walled garden where their products aren’t interoperable with those from other manufacturers.
Aria@lemmygrad.ml to Privacy@lemmy.ml • Charlie Kirk Assassination Sparks Social Media Crackdown · 3 · 11 days ago
Discord is private social media, and most users think their messages aren’t shared with the government without a warrant. Of course, it’s possible that in this case Discord didn’t even provide the messages “to police”, but that the police simply subscribe to Discord’s commercial data-sharing and enjoy the same real-time access that way.
Are you calling the people on the receiving end of the genocide “genocidal”? Is your argument that if they just committed suicide, the pitiable Nazis wouldn’t need to go through the bore of conducting the genocide, so their refusal equates to demanding a genocide?
Beyond even that, they praise inviting people to negotiate and then murdering those negotiators.
It’s not an oxymoron; the idea is that when there are forces with opposed interests, one has to win. Note that this is talking about opposed interests, not interests that are merely in conflict.
So no matter how much you try to make concessions for the other, you have to choose whether you want a bourgeois dictatorship (liberal democracy) or a proletarian dictatorship (people’s democracy) at the end of the day. Socialists just use less euphemism, and are therefore accused of “admitting to dictatorship”, but a liberal democracy is the exact same type of dictatorship: the bourgeoisie’s interests dictate, and they make concessions for the sake of the proletariat.
Can you point to any of the CIA’s metainfo about this file?
I believe this is the page you’re looking for. It’s very minimal. https://www.cia.gov/readingroom/document/cia-rdp80-00810a006000360009-0
Aria@lemmygrad.ml to Technology@lemmy.ml • China Reportedly Advances to 5nm AI Chips as Domestic Firms Tape Out Two New Solutions For Model Training & AI PC Workloads · 2 · 21 days ago
Don’t you have that backwards? Without TSMC’s outstanding technology, the island’s value decreases, both for China and for the USA, which lowers tensions. Conventional wisdom is that reduced tensions also reduce the risk of war.
Aria@lemmygrad.ml to Memes@lemmy.ml • Amerikkka land of the jailed (But China is defo AuThOrITarIAn!!) · 7 · 24 days ago
The person just replying “Wrong” and refusing to elaborate 😂
Aria@lemmygrad.ml to Memes@lemmy.ml • Amerikkka land of the jailed (But China is defo AuThOrITarIAn!!) · 6 · 24 days ago
> old accounts, with maybe 1 comment per month
Isn’t that just lurking behaviour?
Aria@lemmygrad.ml to Memes@lemmy.ml • Amerikkka land of the jailed (But China is defo AuThOrITarIAn!!) · 17 · 25 days ago
It’s weird to me that this particular law was the one the colour revolutionaries rallied behind.
A Hong Kong resident confessed to having committed a murder on Taiwan. China extradites people to Taiwan when they have been summoned to court or have an arrest warrant issued by the Taipei rebel government, as long as it’s for non-political offences. So they would extradite this murderer to be tried on Taiwan.
Different parts of China have different laws, because it’s a big country with autonomous regions. Hong Kong isn’t that big, but for historical reasons it has its own laws as well. If someone has an arrest warrant issued by one of the other Chinese governments, Hong Kong will extradite the person to that jurisdiction. If it’s a different country with which China has an extradition treaty, then Hong Kong will extradite them to Beijing (the Chinese national government) and Beijing will send them on to that other country.
Taiwan is neither a separate country, nor a Chinese government whose arrest warrants Hong Kong respects. But the guy confessed to murder. He should be tried. So new legislation is required to make it legal to extradite him to Taiwan, either directly or through Beijing.
That was the initial controversy.
Aria@lemmygrad.ml to Memes@lemmy.ml • Amerikkka land of the jailed (But China is defo AuThOrITarIAn!!) · 11 · 25 days ago
> establish a platform discussing history of Tiennaman square or Uyghurs without strictly adhering to government set guidelines, then they will likely be prosecuted.
Tiananmen Square has a 600-year history. You’re referring to one event, which is censored. But even that doesn’t cover the portion that is relevant to the history of Tiananmen: no part of the protests is censored. Uyghur history isn’t censored either.
It is true that you have been able to identify one instance of censorship across your two topics (albeit with inaccurate wording). It’s also a particularly sensitive topic with strong disinformation campaigns targeting it. In 2020, many states worldwide imposed censorship on COVID- and vaccine-related topics for similar reasons.
What is this a reference to? Who quoted Tucker Carlson?
Aria@lemmygrad.ml to Technology@lemmy.ml • Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO) · 1 · 27 days ago
300i https://www.bilibili.com/video/BV15NKJzVEuU/
M4 https://github.com/itsmostafa/inference-speed-tests
It’s comparable to an M4, and at most a single order of magnitude faster than a ~1000 euro 9960X, not multiple. And if we’re considering buying used (this is a brand-new product, so it’s less available in western markets), a CPU-only build with an EPYC and more RAM will probably be a better local LLM computer for the cost of two of these plus a basic computer.
Aria@lemmygrad.ml to Technology@lemmy.ml • Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO) · 3 · 27 days ago
That’s still faster than your expensive RGB XMP gamer-RAM DDR5 CPU-only system, and, depending on what you’re running, you can saturate the buses independently, doubling the speed and matching a 5060 or thereabouts. I disagree that you can categorise the speed as negating the capacity, as they’re different axes. You can run bigger models on this. Smaller models will run faster on a cheaper Nvidia. You aren’t getting 5080 performance and 6x the RAM for the same price, but I don’t think that’s a realistic ask either.
Aria@lemmygrad.ml to Technology@lemmy.ml • Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO) · 1 · 27 days ago
I’m not saying you can deploy these in place of Nvidia cards where the tooling is built with Nvidia in mind. I’m saying that if you’re writing code, you can do machine learning projects without CUDA, including training.
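To sketch what I mean (my own toy example in PyTorch, not anything from the article): the backend choice is one line of setup, and nothing downstream has to know whether CUDA exists.

```python
import torch

# Pick whatever accelerator happens to be present; plain CPU is the fallback.
# ROCm builds of PyTorch also answer to torch.cuda, so "cuda" isn't Nvidia-only.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")  # Apple GPUs
else:
    device = torch.device("cpu")

# Nothing below cares which backend it landed on.
model = torch.nn.Linear(128, 10).to(device)
x = torch.randn(32, 128, device=device)
print(model(x).shape, "on", device)
```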
Aria@lemmygrad.ml to Technology@lemmy.ml • Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO) · 3 · 27 days ago
I agree with your conclusion, but these are LPDDR4X, not DDR4 SDRAM. It’s significantly faster. The lack of fans should also be seen as a positive, since it means they’re confident the cards aren’t going to melt. It costs them very little to add visible active cooling to a 1000+ euro product.
Aria@lemmygrad.ml to Technology@lemmy.ml • Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO) · 1 · 27 days ago
You can run llama.cpp on CPU. LLM inference doesn’t need any features that only GPUs have; that’s why it’s possible to make even simpler NPUs that can still run the same models. GPUs just tend to be faster. If the GPU in question is not faster than an equally priced CPU, you should use the CPU (better OS support).
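As a minimal sketch (using the llama-cpp-python bindings and a placeholder model path, purely as an illustration), CPU-only inference is just a matter of not offloading any layers:

```python
from llama_cpp import Llama

# n_gpu_layers=0 keeps every layer on the CPU; llama.cpp then uses whatever
# SIMD and thread support the machine has.
llm = Llama(
    model_path="./model.Q4_K_M.gguf",  # placeholder: any GGUF model file
    n_gpu_layers=0,
    n_threads=8,
)

out = llm("Q: Name one planet in the solar system. A:", max_tokens=16)
print(out["choices"][0]["text"])
```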
Edit: I looked at a bunch of real-world prices and benchmarks, and read the manual from Huawei, and my new conclusion is that this is the best product on the market if you want to run a model at modest speed that doesn’t fit in 32GB but does fit in 96GB. Running multiple in parallel seems to range from unsupported to working poorly, so you should only expect to use one.
Original rest of the comment, made with the assumption that this was slower than it is but that it had better drivers:
The only benefit of this product over a CPU is that you can slot in multiple of them and they parallelise without needing to coordinate anything with the OS. It’s also a very linear cost increase as long as you have the PCIe lanes for it. For a home user with enough money for one or two of these, they would be much better served spending the money on a fast CPU and 256GB of system RAM.
If not AI, then what use case do you think this serves better?
Aria@lemmygrad.ml to Technology@lemmy.ml • Huawei enters the GPU market with 96 GB VRAM GPU under 2000 USD, meanwhile NVIDIA sells from 10,000+ (RTX 6000 PRO) · 1 · 27 days ago
CUDA is not equivalent to AI training. Nvidia offers useful developer tools for using their hardware, but you don’t have to use them. You can train on any GPU or even a CPU. The projects you’ve looked at (?) just chose to use CUDA because it was the best fit for the hardware they had on hand, and they were able to tolerate the vendor lock-in.
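To illustrate (a toy example of my own, not code from any of those projects), here is a complete training loop that runs on a plain CPU; only the device string would change for CUDA, ROCm, or anything else:

```python
import torch

device = torch.device("cpu")  # swap for "cuda", "mps", etc.; nothing else changes

# Tiny regression model trained entirely without CUDA.
model = torch.nn.Sequential(
    torch.nn.Linear(4, 16),
    torch.nn.ReLU(),
    torch.nn.Linear(16, 1),
).to(device)
opt = torch.optim.SGD(model.parameters(), lr=0.05)
loss_fn = torch.nn.MSELoss()

x = torch.randn(256, 4, device=device)
y = x.sum(dim=1, keepdim=True)  # target: sum of the inputs

for step in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

print("final loss:", loss.item())
```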
Nah he’s saying he’ll order terrorist attacks against frustraters.