j. b. crawford

devops consultant. computer curmudgeon. author of Computers Are Bad.

  • 0 Posts
  • 3 Comments
Joined 2 years ago
Cake day: May 31st, 2023

  • People, especially enthusiasts of classical music, sometimes criticize John Williams for being a bit repetitive and over-bombastic. But, well, he’s first and foremost a film composer, and that’s sort of the nature of the genre. People tend to select film composers because they liked their previous work and want something very similar, so composers tend to have a pretty consistent style. You see the same with other prominent film composers like Hans Zimmer. I guess what’s notable about John Williams is that he was such a prolific and popular film composer for such a long period that his style was the style of film scores for a generation. It’s hard to imagine an era of film, with hits like Star Wars and Indiana Jones and Jurassic Park, without his bombast. His tendency to “go big” is part of what defined these epics.

    All that said, my favorite John Williams composition strays a little bit from film. It’s “The Mission,” better known as the theme for the NBC nightly news. The short theme is instantly recognizable to probably most Americans, but not so many have heard the full composition. NBC used to play it over the credits at the end of the last news segment, but I don’t think they do anymore. Fortunately there’s a (potato quality) YouTube video of John Williams himself conducting a Sony studio orchestra: https://www.youtube.com/watch?v=l7kIgcYgIQk


  • Increasingly I find it depends mostly on what the game was built for… I was raised, if you will, on PC point-and-shoots, and so my preference is for mouse and keyboard. But even a lot of AAA games these days that are console ports have noticeable pointer lag and aggressive reticle gravity or other aiming aids. I find these really frustrating since they interfere with the 1:1 sense of motion you get with a mouse, so I’ll switch to a controller instead.

    Hogwarts Legacy is an example of a recent AAA release with such heavy reticle gravity that sometimes the best strategy is to just hold an analog stick forward and not move it (e.g. in the broom races)… I hate this kind of thing, but I feel like it’s something you put up with as a PC gamer due to the popularity edge the consoles have. At least it tends to show up in games where fast aiming isn’t a huge factor.


  • There’s an interesting aspect of this issue that I think the post summary dismisses too quickly. Photos coming from phones these days sort of are AI, and in an annoyingly pervasive way.

    I’ve actually gone back from using my phone to using a proper camera over the last year or so because I’m getting so irritated by the amount of ML-based post-processing my phone does. It results in a lot of photos looking bad, and there’s no easy way to bypass it besides setting the phone to save raw, which sort of defeats the point of using the phone in a lot of ways (the ability to go from taking the photo to posting on the same device). A really common situation for me is taking a photo with my phone that is blurry because of bad focus, shake, low light, or some combination. The phone does really aggressive ML “sharpening” of the image that makes it look extremely artificial and, frankly, a lot worse than if the postprocessing had been omitted. I’ve had entire sets of photos ruined by this kind of “helpfulness.”

    It’s a tricky issue: there absolutely are benefits to cameras using the best technology available to create the best photograph possible. I don’t mean to appeal to some sense of artistic integrity or “real photography” here. I just hate the lack of control over the product. I used to be really into photography as a hobby, had a lot of opinions about lenses, and mostly set up exposures manually. Nowadays I use my Sony Alpha with the kit lens and rarely take it off of its “smart” auto mode, which does have some ML-driven features like subject detection. But it feels like I have so much more control over the output than I do with my phone, because the Sony doesn’t run the image through ten layers of AI processing, not a whole lot better than the state of the art in Instagram filters, before saving it. If I don’t hold the camera steady it’ll just come out motion blurred, not like someone new to Photoshop has just discovered the posterize button.

    As I understand it, Apple is better than most of the Android vendors about this kind of thing, and the iPhone processing probably produces better output - but it’s still frustrating to me that photos are changing from “capturing the scene” to “recreating the scene.” I did graduate work on digital image forensics, where I learned a lot of theory and methods for analyzing and reversing in-camera processing. I did some research on the “auto HDR” feature that was starting to appear in Android devices at the time and whether or not it defeated some known forensic methods for device fingerprinting (mostly, not totally). But that was the tip of the iceberg… it used to be that cameras only did a bit of processing, debayering for example, the kind of thing that really needs to be done to turn sensor data into a useful image because of the properties of the sensor and readout pipeline. But phones, the dominant photographic tool today, are taking it to a whole new level, doing what would once have been very complex postprocessing on every image as it’s taken.

    As with so many things, I guess it’s good when it works, but endlessly frustrating when it doesn’t. At least it feels like the phone vendors are doing their part to preserve “traditional” photographic technology, if that’s what you’d call a Sony mirrorless, by really nerfing phones as tools for people who want real control over the result. I do understand there are third-party apps for iPhone that expose a lot more user control, but it seems like they also have limitations on how much of the camera stack they can control or bypass.
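    To make the debayering step I mentioned concrete: a Bayer sensor records one color per pixel (an RGGB mosaic), and the camera interpolates the missing two channels at every pixel. Here’s a rough sketch of simple bilinear demosaicing in pure numpy. The function names and the normalized-convolution trick are my own illustration, not any vendor’s actual pipeline; real ISPs use far more sophisticated (and increasingly ML-assisted) interpolation.

```python
import numpy as np

def conv3x3(img, k):
    """Zero-padded 3x3 convolution via shifted slices (kernel is symmetric,
    so convolution and correlation coincide)."""
    p = np.pad(img, 1)
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + h, j:j + w]
    return out

def demosaic_rggb(raw):
    """Bilinear demosaic of an RGGB Bayer mosaic (2D array) into an HxWx3
    RGB image. Each channel is interpolated by a normalized convolution:
    smooth the sampled values, then divide by the smoothed sample mask."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1  # R at even rows/cols
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1  # B at odd rows/cols
    g_mask = 1 - r_mask - b_mask                       # G everywhere else
    k = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float)
    out = np.zeros((h, w, 3))
    for c, mask in enumerate([r_mask, g_mask, b_mask]):
        out[..., c] = conv3x3(raw * mask, k) / conv3x3(mask, k)
    return out
```

    A scene of uniform brightness comes back unchanged, which is a decent sanity check that the interpolation weights are normalized correctly. Everything phones now pile on top (multi-frame merging, ML sharpening, tone mapping) happens after a step like this one.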