

Which is why I asked, because we are in a linux comm here. I don’t put aside games that do shit on Windows, as long as they work fine on Linux.
How would it work on Linux then?
My impression of Starfield (after release, at least) was that it was a bunch of pretty well-intended and well-implemented subsystems (which is, to my knowledge, quite common in game development; each team works on a different one), but they just don’t fit together really well. All the subsystems are good parts of a theoretically good overall big picture, but the complexity seemed too high for them to actually flesh out that big picture.
Technically it all works, but IMO you feel the conceptual gaps whenever you transition (UX-wise) from one gameplay mechanic to the next. It just doesn’t (or didn’t) feel like a cohesive game.
Which is completely reasonable. Insanity is trying the same thing over and over and expecting different outcomes.
It’s not like they tried nothing and are all out of ideas; they tried a lot and nothing stuck so far.
We recently had a funny problem. Our service ran fine, but a Postgres upgrade failed because some pg internals were broken (broken ref ids). Dumping the DB failed with the same error. Reading and writing were still fine, though. So we restored backup after backup… no dice. They all had the same issue: everything worked for the service, but we couldn’t perform any maintenance. Ultimately we had to “manually” dump the data of the service and replay it into a fresh DB. That took quite a while. But it was interesting, since even the verification of the backups didn’t help us notice that kind of corruption.
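For the curious: since plain reads still worked, the rescue boiled down to per-table CSV exports replayed into a fresh cluster, roughly like this (a sketch only; database and table names are made up, and the real thing involved recreating the schema from our migrations first):

    # pg_dump choked on the broken catalog refs, but per-table COPY still worked:
    psql service_db -c "\copy orders TO 'orders.csv' WITH CSV"
    # ... one line per table ...
    # then, after recreating the schema in a fresh database:
    psql fresh_db -c "\copy orders FROM 'orders.csv' WITH CSV"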
I am also a former TeX addict, but I was always more in favor of ConTeXt over LaTeX. And Typst is basically ConTeXt, but a lot faster (as in, you get a real-time preview as you type).
Huh? What’s wrong with Overleaf?
If you “only” need beautiful PDFs and it doesn’t have to be online, you can also use Typst locally, with VS Code and Tinymist as the editor. Not as powerful as TeX, but I know few people who use TeX even remotely to its fullest. The upside of Typst is that the “core” syntax for content writing is very markdown-like, so you can focus on writing instead of on the underlying language.
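To illustrate what I mean by markdown-like (a made-up snippet, not from any real document):

    = My Report
    Some text with *strong* and _emphasized_ words.

    - lists look like this
    - and inline math too: $a^2 + b^2 = c^2$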
Server written in C++ and client in Java and Lua… now that’s an atypical combination. It still piques my interest.
Backblaze B2 using Kopia
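If anyone wants to try that combo, the setup is roughly this (bucket name and credentials are placeholders; check the Kopia docs for the exact flags):

    kopia repository create b2 --bucket=my-backups --key-id=<applicationKeyId> --key=<applicationKey>
    kopia snapshot create /home/me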
That would be so damn awesome if I could finally play 4K 120Hz GfN on Linux.
I would rather bet that most people have no clue what an operating system is, let alone that the one they (unknowingly) use is made by Microsoft. On the other hand, if they play games (on that PC), they will know Steam, because they actively had to install it and they click its icon frequently.
The linked ticket also references a merge request that went stale. So I would assume that is a good starting point (I haven’t looked at the MR, though, so I don’t know how far off it is from a solution that would be accepted).
I don’t think there is a technical reason. Simply, no one has been interested enough to implement it yet. See Nate’s answer over at Reddit and the associated ticket.
So once someone is motivated enough, it will happen. But without contributions, or extreme boredom on the part of the core maintainers (haha), it won’t.
It kind of is, unfortunately. Games are often developed under a lot of pressure, with the budget constantly at risk of being cut. I don’t think the devs are incompetent or believe that what they produced (code-quality-wise) is the best possible; but what can they do if they need a result to present to the publisher by the end of the week, and then don’t get money (aka time) to clean it up, but the next deadline instead?
On the other hand, I am also not sure I can blame publishers. Things can easily spiral out of control if managed badly in the other direction… see Cloud Imperium Games (i.e. Star Citizen).
Yeah, but it also shows the weird naming of WSL. WoW64 is Windows (32) on Windows 64, yet WSL is the Windows Subsystem for Linux instead of Linux on Windows 64 (which would at least have fit the pattern).
btrfs because it was simple
Personally, I found ZFS far simpler. The userspace tools make more sense to me. I also like that datasets can have a default (relative) mount point attached. So in a recovery scenario, I simply have to import the zpool with an alternate base path, and then all my datasets are ready to go. If I want to recover a btrfs system with multiple subvolumes, I typically need to know exactly which ones to mount and where, each individually.
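Concretely, the recovery scenario is pretty much one command with ZFS (pool name, device, and subvolume names below are just examples):

    # import the pool with an alternate root; every dataset with a
    # configured mountpoint gets mounted under /mnt automatically
    zpool import -R /mnt rpool
    # whereas with btrfs, each subvolume is its own mount:
    mount -o subvol=@ /dev/nvme0n1p2 /mnt
    mount -o subvol=@home /dev/nvme0n1p2 /mnt/home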
Also, I got really used to zfsbootmenu.
Microsoft really has a knack for that. I also like WoW64, the layer for running 32-bit applications on 64-bit Windows (its 32-bit system binaries live in SysWOW64). For historical reasons, the 64-bit binaries live in system32, obviously.
KDE is one of the main reasons for me to use Linux. I immensely like the performance, silence, and battery life of MacBooks. But if I have to work with anything but KDE, it’s not worth it for me. The only thing OSX does better than basically any other desktop out there is the ability to drag whole virtual screens between monitors.
I can understand Hellwig’s fear, though.
From what I gather as a bystander, it’s apparently common that a refactoring in your module that breaks its API involves fixing all the call sites, to keep the effort on the person responsible for the change. Now the Rust maintainers say “it’s fine; if it breaks, we’ll deal with it”, which theoretically takes the cross-language issue away from the C maintainer. In practice, I can very well see this still causing friction in the future.
Let’s say such a change happens at a time when there’s a bit of schedule pressure and the Rust maintainers’ capacity is thin, for whatever reason. Will they still happily swallow that change, or will they start to discuss whether it’s really necessary? And suddenly the C maintainer has a political discussion on top of the technical issue they wanted to solve.
As someone who just wants to get shit done, I would definitely have that fear.
(That doesn’t mean the bullet isn’t worth swallowing; the change overall can still be worth the friction. I am just saying that it’s not totally unwarranted for a maintainer to feel affected by this, even though the other parties’ current pledges promise otherwise; that stance can change, or at least be challenged, over and over.)
Really? IMO not with GPUs. They have released Linux drivers for decades, and always in time for new kernel versions. ATI was typically way behind and buggy as hell. I would likely not have switched to Linux on the desktop in 2006 if it wasn’t for my GPU “just working”, without any fiddling. Performance was always on par with Windows, and stuff like multi-monitor setups just worked. They even had their nice setup utility to configure Xorg for you.
Could they have handled the transition to Wayland better? Maybe. But claiming they earned a bad reputation with regard to GPUs, when they are the one big vendor that had extremely active Linux support for ages, is dishonest and unwarranted, IMO.