• 18 Posts
  • 1.79K Comments
Joined 1 year ago
Cake day: October 4th, 2023


  • One other factor that I think is an issue with motion blur: the modeling of shifting gaze in video games often isn’t fantastic, due to input and output device limitations.

    So, say you’re just looking straight ahead in a game. Then motion blur might be fine – only moving objects are blurred.

    But one very prominent place where motion blur shows up is when the direction of your view is changing.

    In a video game, especially if you’re using a gamepad, it takes a while to turn around. And during that time, if the game is modeling motion blur, your view of the scene is blurred.

    Try moving your eyeballs from side to side for a bit. You will get a motion-blurred scene. So that much is right.

    But the problem is that if you look to the side in real life, it’s pretty quick. You can maybe snap your eyes there, or maybe do a head turn plus an eye movement. It doesn’t take a long time for your eyes to reach their destination.

    So you aren’t getting motion blur of the whole surrounding environment for long.

    That is, humans have eyes that can turn rapidly and independently of our heads to track things, and heads that can turn independently of our torsos. So we can often keep our eyes locked in one direction or snap them to another, and only have brief periods of motion blur.

    Then on top of that, many first-person shooters and other games have a crosshair centered on the view, so aiming involves moving the view too. That is, the twin-stick video game character is basically an owl: eyes fixed relative to their head, head fixed relative to their torso (at least in terms of yaw), a gun strapped to their face, and, on top of all that, a limited rate of turn. A real-life person built like that would probably find motion blur more prominent too, since much of the time they’d have to be moving their whole view relative to whatever they want to be looking at.

    Might be that it’d be better if you’re playing a game with a VR rig, since then you can have – given appropriate hardware – eye tracking and head tracking and aiming all separate, just like a human.

    EDIT: Plus the fact that monitors usually cover a smaller FOV than human FOV, so you have to move your direction of view around more for situational awareness.

    https://old.reddit.com/r/askscience/comments/gcrlhn/what_fov_do_humans_have_like_in_video_games_can/

    Human field of view is around 210 degrees horizontally. Each eye has about 150 degrees, with about 110 degrees common to the two and 40 degrees visible only to that eye.

    A typical monitor takes up a considerably smaller chunk of one’s viewing arc. My recall from past days is that PC FPS FOV is traditionally rendered at 90 degrees. That’s actually a mild fisheye effect – the arc the screen actually subtends is usually lower, more like 50 degrees, if you wanted an undistorted view. IIRC, true TV FOV is usually even smaller, since TVs are larger but viewers sit a lot further away, so console games might use lower values still. So you’re working with this relatively-small window into the video game world, and you need to move your view around more to maintain situational awareness; again, more movement of your direction of view. A VR rig also might help with that, I suppose, due to the wide FOV.
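
    To put a rough number on that mismatch: the arc a screen subtends is just geometry, so you can sanity-check the ~50-degree figure yourself. The sizes and viewing distances in the sketch below are assumptions about a typical setup, not measurements:

    ```c
    #include <math.h>
    #include <stdio.h>

    /* Horizontal angle (in degrees) subtended by a flat screen of the given
     * width, viewed head-on from the given distance (any unit, same for both). */
    static double subtended_deg(double screen_width, double view_distance)
    {
        return 2.0 * atan((screen_width / 2.0) / view_distance) * 180.0 / 3.14159265358979;
    }

    int main(void)
    {
        /* Assumed numbers: a 27" 16:9 monitor is about 60 cm wide, and a
         * typical desk viewing distance is also about 60 cm. */
        printf("desktop monitor: %.0f degrees\n", subtended_deg(60.0, 60.0));

        /* Assumed numbers: a 65" TV is about 144 cm wide, viewed from a couch
         * roughly 2.5 m away. */
        printf("living-room TV:  %.0f degrees\n", subtended_deg(144.0, 250.0));

        return 0;
    }
    ```

    With those assumed numbers you get roughly 53 degrees for the desktop monitor and roughly 32 degrees for the TV – consistent with rendering a 90-degree FOV onto a screen that only subtends about 50, and with console/TV setups being narrower still.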



  • Motion blur is a win if it’s done correctly. Your visual system can make use of that blur to determine the movement of objects; it expects it to be there. Move your hand quickly in front of your eyes – your fingers are a blur.

    If you’ve ever seen something filmed at a high frame rate and then played back at a low frame rate without any sort of interpolation, it looks pretty bad. Crystal-clear stills, but jerky.

    A good approximation is to keep ramping FPS higher and higher.

    But…that’s computationally expensive, and your head can’t actually process 1000 Hz or whatever anyway. What it’s getting is just a blur of multiple frames.

    It’s theoretically possible to have motion blur approaches that are more-efficient than fully rendering each frame, slapping it on a monitor, and letting your eye “blur” it. That being said, I haven’t been very impressed by what I’ve seen so far in games. But if done correctly, yeah, you’d want it.
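
    As a minimal sketch of the brute-force end of that spectrum – with render_subframe() as a purely hypothetical stand-in for whatever actually draws the scene at a point in time – accumulation-style blur renders several sub-frames per displayed frame and averages them, rather than pushing every sub-frame to the monitor:

    ```c
    #include <stdint.h>
    #include <string.h>

    #define WIDTH     1920
    #define HEIGHT    1080
    #define SUBFRAMES 8          /* sub-samples averaged into each displayed frame */

    /* Hypothetical hook: fills 'rgb' with the scene as it appears at time 't'. */
    void render_subframe(float t, uint8_t rgb[HEIGHT][WIDTH][3]);

    /* Accumulation-buffer motion blur: render SUBFRAMES instants spread across
     * one frame's time window and average them, so anything that moved during
     * the window smears across its path instead of appearing as a crisp still. */
    void render_blurred_frame(float frame_start, float frame_len,
                              uint8_t out[HEIGHT][WIDTH][3])
    {
        static uint32_t accum[HEIGHT][WIDTH][3];
        static uint8_t  sub[HEIGHT][WIDTH][3];

        memset(accum, 0, sizeof accum);

        for (int s = 0; s < SUBFRAMES; s++) {
            float t = frame_start + frame_len * (s + 0.5f) / SUBFRAMES;
            render_subframe(t, sub);
            for (int y = 0; y < HEIGHT; y++)
                for (int x = 0; x < WIDTH; x++)
                    for (int c = 0; c < 3; c++)
                        accum[y][x][c] += sub[y][x][c];
        }

        for (int y = 0; y < HEIGHT; y++)
            for (int x = 0; x < WIDTH; x++)
                for (int c = 0; c < 3; c++)
                    out[y][x][c] = (uint8_t)(accum[y][x][c] / SUBFRAMES);
    }
    ```

    That’s basically the “render at 1000 Hz and let it blur” approach done in software, which is why cheaper approximations are attractive when they’re done well – like the sword-arc trick in the edit below.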

    EDIT: A good example of a specialized motion blur that’s been around forever in video games has been the arc behind a swinging sword. It gives the sense of motion without having to render a bazillion frames to get that nice, smooth arc.







  • If that email is actually from Logitech, it probably has some way to unsubscribe. They might have added you for some nonsense reason like a warranty registration, but I’ve never had problems getting a reputable company to let me unsubscribe.

    The random scam stuff…yeah, probably can’t do much about that.

    One possibility I’ve wondered about is whether, someday, email shifts to a whitelist-based system. I mean, historically we’ve always let people be contacted as long as they know someone’s physical address or phone number or email address, and so databases of those have value – they become keys to reach people. But we could simply have some sort of easy way to authorize people and block everyone else. In a highly-connected world, that might be a more reasonable way to do things.



  • Looks fine to me.

    Little side question: Will the Wi-Fi and Bluetooth on the motherboard work in Arch? From what I could gather, the drivers for it should be in the latest kernel, but I’m not 100% sure.

    If they don’t for some reason and you can’t get them working or need some sort of driver fix, you can always, worst case, fall back to a USB dongle or similar until they do. Obviously it’s preferable not to do that, but you shouldn’t wind up stuck without them no matter what.




  • Alexey Pajitnov, who created the ubiquitous game in 1984, opens up about his failed projects and his desire to design another hit.

    He prefers conversations about his canceled and ignored games, the past designs that now make him cringe, and the reality that his life’s signature achievement probably came decades ago.

    The problem is that that guy created what is probably the biggest, most timeless simple video game in history. Your chances of repeating that are really low.

    It’s like you discover fire at 21. The chances of doing it again? Not high. You could maybe do other successful things, but it’d be nearly impossible to do something as big again.


  • The downside of building the phone/tablet into the car, though, is that phones change more quickly than cars.

    A 20-year-old car can be perfectly functional. A 20-year-old smartphone is insanely outdated. If the phone is built into the car, you’re stuck with it.

    Relative to a built-in system, I’d kind of rather just have a standard mounting point with security attachments and have the car computer be upgradeable. 3DIN, maybe.

    I get the “phone is small” argument, but the phone is upgradeable.

    And I’d definitely rather have physical controls for a lot of things.




  • Plus, even if you manage to never, ever have a drive fail, accidentally delete something that you wanted to keep, inadvertently screw up a filesystem, crash into a corruption bug, have malware destroy stuff, make an error writing a script that causes it to wipe data, just realize that an old version of something you overwrote was still something you wanted, or run into any of the other ways in which you could lose data…

    You gain the peace of mind of knowing that your data isn’t a single point of failure away from being gone. I remember some pucker-inducing moments before I ran backups. Quite aside from the occasions where backups saved my data, I could sleep a lot more comfortably all the rest of the time.


  • That’s not a completely reliable fix, a third party library could still call setenv and trigger crashes, there’s still a risk of data races, but we’ve observed a significant reduction in SIGABRT volumes.

    Hmm. If they want a dirty hack, I expect they could do a library interposer that overrides setenv(3) and getenv(3) symbols with versions that grab a global “environment variable” lock before calling the actual function.

    They say that they’re having problems with third party libraries that use environment variables. If they’re using third-party libraries statically-linked against libc, I suppose that won’t work, but as long as they’re dynamically-linked, should be okay.
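
    A minimal sketch of that interposer, assuming Linux/glibc-style dynamic linking and LD_PRELOAD (no error handling, and the lock only serializes the libc calls themselves – it doesn’t make the returned pointer safe, per the EDIT below):

    ```c
    /* envlock.c – build as a shared object and load with LD_PRELOAD so these
     * definitions get resolved ahead of libc's. */
    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <pthread.h>

    static pthread_mutex_t env_lock = PTHREAD_MUTEX_INITIALIZER;

    char *getenv(const char *name)
    {
        /* Lazy lookup of the real libc function; fine for a sketch. */
        static char *(*real_getenv)(const char *);
        if (!real_getenv)
            real_getenv = (char *(*)(const char *))dlsym(RTLD_NEXT, "getenv");

        pthread_mutex_lock(&env_lock);
        char *value = real_getenv(name);
        pthread_mutex_unlock(&env_lock);
        return value;    /* still a bare pointer into the environment */
    }

    int setenv(const char *name, const char *value, int overwrite)
    {
        static int (*real_setenv)(const char *, const char *, int);
        if (!real_setenv)
            real_setenv = (int (*)(const char *, const char *, int))
                              dlsym(RTLD_NEXT, "setenv");

        pthread_mutex_lock(&env_lock);
        int rc = real_setenv(name, value, overwrite);
        pthread_mutex_unlock(&env_lock);
        return rc;
    }
    ```

    Something like `gcc -shared -fPIC -o envlock.so envlock.c` (plus `-ldl` on older glibc), then run with `LD_PRELOAD=./envlock.so`. You’d want to give putenv() and unsetenv() the same treatment.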

    EDIT: Though you’ve still got an atomic-update problem with the returned buffer, doing things that way, if you don’t want to leak memory. Like, one thread might have half-updated the contents of the buffer while another thread is reading the buffer it got back from the interposer’s version of the function. That shouldn’t directly crash, but you can get a mangled environment variable value. And there aren’t going to be any guarantees on synchronization of access to that buffer, the way there are for the getenv() call itself.

    thinks

    This is more of a mind-game solution, but…

    Well, you can’t track the lifetime of pointers to a buffer, so there’s no true fix that doesn’t leak memory. The only absolute fix is to return a new buffer from getenv() for each unique setenv(), since POSIX provides no lifetime bounds.

    But if you assume that anything midway through reading a buffer is probably going to finish that read pretty soon, which is probably true…

    You can maybe play tricks with mmap() and mremap(), if you’re willing to blow a page per environment variable that you want to update, a page of virtual address space per update, and some temporary memory. The buffer you return from the interposer’s getenv() is an mmap()ed range. In the interposer’s setenv(), if the value is modified, you mremap() it with MREMAP_DONTUNMAP. Future calls to getenv() return the new address. That gives you a userspace page fault handler for the old range, which I suppose – I haven’t written userspace page fault handlers myself – can probably block the memory read until the new value is visible and synchronize on visibility of changes across threads.

    If you assume that any read of the buffer is sequential and moving forward, then if a page fault triggers on an access at the address at the start of the page, you can return the latest value of the variable.

    If you get a fault via an address into the middle of the buffer, and you still have a copy of the old value, then you’ve smacked into code in the middle of reading the buffer. Return the old value.

    A given amount of time after an update, you’re free to purge old values from setenv(). You can do that from inside the interposer’s functions.

    You can never eliminate the chance that a thread has read the first N bytes of an environment variable buffer, then gone to sleep for ten minutes, then suddenly wants the remainder. In that case, you have to allow for the possibility that the thread sees part of the old environment variable value and part of the new. But you can spend temporary memory to remember old values for longer and make that ever more unlikely.
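
    For concreteness, the skeleton of that trick might look something like this – untested, assumes Linux 5.7+ for MREMAP_DONTUNMAP, and the userfaultfd fault-handling thread (the part that decides between the old and new value) is only described in a comment:

    ```c
    #define _GNU_SOURCE
    #include <fcntl.h>
    #include <linux/userfaultfd.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <sys/syscall.h>
    #include <unistd.h>

    #ifndef MREMAP_DONTUNMAP
    #define MREMAP_DONTUNMAP 4   /* not in older glibc headers */
    #endif

    #define PAGE 4096

    /* One mmap()ed page per tracked environment variable; the pointer handed
     * out by the interposer's getenv() points into a page like this. */
    static void *alloc_value_page(const char *value)
    {
        char *page = mmap(NULL, PAGE, PROT_READ | PROT_WRITE,
                          MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        strncpy(page, value, PAGE - 1);
        return page;
    }

    /* On the interposer's setenv(): move the live page somewhere else but keep
     * the old address mapped (MREMAP_DONTUNMAP), so any later access through a
     * stale pointer faults as a "missing" page that userspace gets to resolve. */
    static void *retire_value_page(int uffd, void *old_page)
    {
        void *new_page = mremap(old_page, PAGE, PAGE,
                                MREMAP_MAYMOVE | MREMAP_DONTUNMAP);

        /* Have userfaultfd deliver those faults to us.  (A real version would
         * need to think about the window between the mremap and this register.) */
        struct uffdio_register reg = {
            .range = { .start = (unsigned long)old_page, .len = PAGE },
            .mode  = UFFDIO_REGISTER_MODE_MISSING,
        };
        ioctl(uffd, UFFDIO_REGISTER, &reg);

        return new_page;   /* future getenv() calls hand out this address */
    }

    int main(void)
    {
        /* A real version would spawn a thread that read()s fault events from
         * uffd and resolves each with UFFDIO_COPY: copy in the new value if the
         * faulting address is the start of the page (a fresh read), or the
         * retained old value if it's mid-buffer (a read already in progress). */
        int uffd = syscall(SYS_userfaultfd, O_CLOEXEC | O_NONBLOCK);
        struct uffdio_api api = { .api = UFFD_API };
        ioctl(uffd, UFFDIO_API, &api);

        void *page  = alloc_value_page("old value");
        void *moved = retire_value_page(uffd, page);
        (void)moved;
        return 0;
    }
    ```

    Whether a userfaultfd handler can actually make the cross-thread visibility work out the way I described above is the part I’d want to prove out first.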