
Are there any creators who have evolved and shoot at high frame rates, eliminating the need for motion interpolation and its artifacts, or is the grip of the bad old film culture still too strong? (There are at least some 48fps films.)

Most of the issues (like "judder") that people have with 24fps are due to viewing it on 60 fps screens, which will sometimes double a frame, sometimes triple it, creating uneven motion. Viewing a well shot film with perfect, expressive motion blur on a proper film screen is surprisingly smooth.
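A rough sketch of the arithmetic (assuming a plain 60 Hz panel with no interpolation, purely as an illustration): 60/24 = 2.5, so the panel has to hold alternate film frames for 3 and then 2 refreshes, which is exactly the uneven cadence people notice.

  // Illustration only: the 3:2 cadence of 24fps content on a hypothetical 60 Hz panel.
  #include <cstdio>
  int main() {
      for (int frame = 0; frame < 6; ++frame) {
          int refreshes = (frame % 2 == 0) ? 3 : 2;          // frames alternate 3 and 2 refreshes
          double on_screen_ms = refreshes * (1000.0 / 60.0); // ~50 ms vs ~33 ms on screen
          std::printf("film frame %d: %d refreshes, %.1f ms\n", frame, refreshes, on_screen_ms);
      }
      return 0;
  }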

The "soap opera" feel is NOT from bad interpolation that can somehow be done right. It's inherent from the high frame rate. It has nothing to do with "video cameras", and a lot to do with being simply too real, like watching a scene through a window. There's no magic in it.

Films are more like dreams than like real life. That frame rate is essential to them, and its choice, driven by technical constraints of the time when films added sound, was one of the happiest accidents in the history of the arts.


> Films are more like dreams than like real life.

Yes! The other happy accident of movies that contributes to the dream-like quality, besides the lower frame rate, is the edit. As Walter Murch says in "In the Blink of an Eye", we don't object to jumps in time or location when we watch a film. As humans we understand what has happened, despite such a thing being impossible in reality. The only time we ever experience jumps in time and location is when we dream.

I would go further and say that a really good film, well edited, induces a dreamlike state in the viewer.

And going even further than that, a popular film being viewed by thousands of people at once is as though those people are dreaming the same dream.


I would say that cuts are something we get used to rather than something that is intrinsically “natural” to us.

I remember when I was very little that it was actually somewhat “confusing”, or at least quite taxing mentally, and I’m pretty sure I see this in my own very little children.

As we grow and "practice" watching plays, TV, movies, and reading books, our brains adapt and we become completely used to it.


Cuts aren't "natural" but they're part of the language of filmmaking, and most people's experience of them is consistent.

https://en.wikipedia.org/wiki/Kuleshov_effect#:~:text=The%20...


True. Maybe we experience jumps in time and location in our dreams because we've been conditioned to it by films.

> Most of the issues (like "judder") that people have with 24fps are due to viewing it on 60 fps screens

That can be a factor, but I think this effect can be so jarring that many would realize that there's a technical problem behind it.

For me 24 fps is usually just fine, but if I find myself tracking something with my eyes that wasn't intended to be tracked, then it can look jumpy/snappy. Like watching fast-flowing end credits but, instead of following the text, keeping the eyes fixed on some point.

> Films are more like dreams than like real life. That frame rate is essential to them, and its choice, driven by technical constraints of the time when films added sound, was one of the happiest accidents in the history of the arts.

I wonder though, had the industry started with 60 fps, would people now applaud the 24/30 fps as a nice dream-like effect everyone should incorporate into movies and series alike?


I have a 120 fps TV. Panning shots at 24 fps still give me an instant headache.

Real is good, it’s ergonomic and accessible. Until filmmakers understand that, I’ll have to keep interpolation on at the lowest setting.


It's not just the framerate mismatch: OLEDs' un-pulsed presentation with almost instant response time greatly reduces the perceived motion smoothness of lower-framerate content compared to, e.g., CRTs or plasma displays.

The same happens to me in cinemas at 24 fps, it’s not the display technology that is giving me headaches.

Not sure if it's the same thing, but nearly all cinemas are digital nowadays, and panning artifacts are absolutely still there

It’s happened to me since before any cinemas were digital. I only figured out why by trying to play games below 30 fps. At least for me, it’s definitely the frame rate.

It depends on the shot a lot too. Good DPs will use a panning speed that looks much smoother at 24fps.

Variable refresh rate displays are becoming popular in smartphones and PCs, hopefully this won't be a technical issue soon.

24 fps looks like terrible judder to me in the cinema too. I'm not afraid to admit it even if it will ruffle the feathers of the old 24 fps purists. It was always just a compromise between film cost and smoothness. A compromise that isn't relevant any longer with digital medium. But we can't have nice things it seems, because some people can't get over what they're used to.

>It was always just a compromise between film cost and smoothness.

I think the criticisms of The Hobbit when it came out in 48fps showed that it's not just that.


The 48 fps of The Hobbit was glorious. First time I have ever been able to see what is happening on screen instead of just some slide deck mess. There were many other things worth criticizing, but the framerate was not it.

That film had many problems, but the acceptable frame rate was not one of them. Most criticism wasn’t about that.

True, but there was specific criticism about how the framerate made it far too easy to see the parts of the effects, sets, and costumes that made it clear things were props and spoiled the illusion. Maybe we just require a new level of quality in set design to enable higher frame rates, but it clearly has some tradeoff.

I think that’s definitely the case with 4K, and we’ve seen set detail design drastically improve lately as a response.

I don’t see how it’s the case for frame rate, except perhaps for CGI (which has also improved).

I think just like with games, there’s an initial surprised reaction; so many console-only gamers insisted they can’t see the point of 60 fps. And just like with games, it only takes a little exposure to get over that and begin preferring it.


The problem is modern OLED TVs: they have no motion blur, so it's a chopfest at 24 Hz (or 24fps content at 120 Hz) when you turn off all motion settings.

Yes, and records sound better than digital audio.

You've just learned to associate good films with this shitty framerate. Also, most established film makers have yet to learn (but probably never will) how to make stuff look good at high frame rates. It's less forgiving.

It'll probably take the next generation of viewers and directors..


James Cameron is one of the few who do this.

But the high FPS version is only in cinemas

It's unbelievable that we try so hard to solve this problem even after CRTs are extinct. Every LCD-type screen is easily made to refresh at any rate below its max. If we can't show a 24fps movie at 24fps on our TVs (or smoothly interpolated to 48fps)...what are we doing as a society? It's not like people think TV is an unimportant corner of their lives.

Considering that practically the only metric of economic success in the US oligarchy is the price of the flat-screen TV, you'd imagine they'd at least work by now. In at least one price range.

I've got a "smart" TV that I didn't want, but that's the only thing they offer in my price range anymore. Maybe 5 years old. Stopped connecting to Wi-Fi, an actual hardware problem. Bricked. Opened the TV, cleaned the contacts and uncreased some wire strip. Has been working ever since. Most people would have thrown it out and bought another. But I'm the bad guy for using incandescent light bulbs.

Does the suspension of disbelief break in games, which are not reality? Is there any evidence that lower quality is better?

I think that whole complaint is just "people getting used to how it is". Games are just worse at lower frame rates because they are interactive, and because we never had a 24fps era; games had a lower frame rate only if the studio couldn't get it to run better on the given hardware.

With one caveat: in some games that use animation-inspired aesthetics, the animation itself is not smoothed out but basically runs at the slower frame rate (see the Guilty Gear games), while everything else (camera movement, some effects) is silky smooth and you still get a quick reaction to your inputs.


Games are supposed to be fun, input latency is not fun.

The same way you sell your disks: find a buyer, send them the game files, they send you the money

> find a buyer

this buyer would rather buy from GOG than from you, unless you give a significant discount (and even then, the trust is hard to establish).

Therefore, even if you might have a legal right to re-sell (which you really don't unfortunately), the actual sale won't happen.


That's not relevant to the issue of "ownership"

Why do you think this contradicts anything? Heavy users hit a budget limit and continue consuming more via pirating.

You really need something way better than some shoddy survey to counter the obvious fact that price matters


Yeah, but if a pirate would not have paid the full price, why care? It is by definition not a lost sale; the most likely outcome is just an increase of the player count by one.

Because the price isn't binary? Also, the total spend isn't fixed either; it depends on how easy it is to pirate. So it's by definition still lost revenue, even if later/at a reduced price.

Consider the two cases

A: I pirated a game 25 years ago and played it after school

B: I didn't

which cases do you think will make me more likely to buy more versions of that game later?


Consider reality instead, you can make any fantasy case you want:

C. You didn't pirate, but played because your friends were deeply into it, so you skipped buying lunch to save money and pay for the game (pirating was hard for this specific DRM). You bought it at a discount on sale (remember, the price isn't fixed?). That feeling of overcoming hardship and friendship fused into a very positive experience, making it 10 times more likely for you to buy the next version than in A or B. The overall likelihood was still tiny, though, because now you have a family and don't have time to play. And

D. Considering the amount of uncertainty (your game company will go out of business in 25 years), the value of your "more likely" is $0


Not paying full price is not a "lost sale". People unwilling to pay full price wait for a discount or price reduction. Look at how popular the seasonal Steam sales are. Pirating the game very likely means they never purchase it at any price, which _is_ a lost sale.

It's only a lost sale if that person would otherwise have purchased it. At least in my personal experience that was _never_ the case.

There is more to this RE: perceived value of respective sides.

Edit: missed a word


It contradicts the post it was replying to, which was saying, effectively, that people don't want to spend any money on stuff.

I don't think it's required to be making some universal point when you clearly respond to the argument put forward in the post you reply to, do you?


No, you misunderstood the comment: it said that paying nothing is compelling, not that paying something was inconceivable. It was a response to a comment with the common misconception that pirating is only a "service problem".

I agree with your earlier comment (GGP) and feel like you're contradicting yourself here. "Too expensive" is either a service problem or at least directly adjacent to it. It's distinct from "well if I can get away with piracy then I'll do it". To say that free is a compelling price is to imply the latter as opposed to the former (at least imo).

the principles aren't sound

> To promote balanced usage, ... equal distribution eliminates the strain of overextending the right fingers

What overextension? You don't even type them frequently enough for your index/middle finger on the home row to notice anything, and "cognitive overhead" is lower if they're paired together.

And neither is this strategy

> we reach up for numbers,..This strategic approach ensures that my layout and daily typing tasks never overwhelm my cognitive load.

The default numbers are so inconveniently placed that you don't really get much proficiency in using them, so you won't lose much if you switch from some great numpad layout back to the horizontal line, just like using a regular numpad has no effect on your ability to use the horizontal row. And a numpad can't overwhelm anything, since it is extremely common.

This is just bad strategy, using superficial logic to hurt ergonomics.

The familiarity with more rarely used symbols might add overhead if broken, but maybe if symbols are mapped to the same numbers it won't be much? (this is at least plausible unlike with the numbers themselves)


I love almost everything about the current revolution in keyboards (the mech switches, ergonomic layouts, and open-source designs), but I do think this arms race towards fewer and fewer keys is just getting ridiculous. Yes, you can use chords and layers, but at some point I think the cognitive overhead is outpacing whatever size and ergonomic advantages there could be, especially if you're a programmer and frequently need to type symbols from the weirder parts of the keyboard. Maybe people doing a lot of pure writing find them more useful, idk.

I think the same thing, and then I went a little smaller! I went to a large split, then to a 58-key split, then to a 42-key split. At 42 I saw no advantage in going smaller other than it being smaller, if you liked the look of it. Then I wanted to try a small dactyl, and that led me to an already designed 36-key split, and I love it. I lost some more keys and found that I can easily handle that. I would not say that the move from 42 to 36 made it more ergonomic, but it was not worse. While I went from 42 to 36 without thinking there were downsides, I think going any smaller does start to compromise functionality for the sake of form. At 36, I think that even on a bigger keyboard I would emulate the layout I have now, as it is so easy.

This conclusion makes as much sense as saying software delivers no value because you've never personally seen an app without bugs

Or you make the tasks smaller and ban bathroom breaks, or replace the task with an equivalently challenging one if a break is given. Or you check for phones before the exam. There are a lot of things that can be done.

> Apparently we were not registered at Troisdorf station, so we are on the wrong tracks. We cannot stop.”

And of course there is some huge fine or even potentially jail time if you moo in protest and pull that nice red lever to avoid the Christmas present of this bureaucratic idiocy (after all, you have legs that are capable of crossing train tracks and eyes to do that safely)?


It's for emergencies... if that happened to me, I'd argue in court that I thought the driver had gone insane (because a system can't work like that), so it qualified as an emergency.

But back to my country (Poland), it's better here - some had problems with physically getting out at the right station, and when the conductor saw it she even encouraged us to pull this lever in those cases so we wouldn't have to get out at the wrong station.


Hyper-ventilate some, scream "It's too hot in here, I think I'm dying!" and presto-bango, your very first panic attack and mental breakdown.

> For instance: In large codebases, consistency is more important than “good design”

But this is exactly the type of generic software design advice the article warns us about! And it mostly results in all the bad software practices we as users know and love remaining unchanged (consistently "bad" is better than being good at least in some areas!)


I don’t know. At my place a lot of cowboy engineers decided to do things their own way. So now we have the random 10k lines written in Redux (not used anywhere else) that no one likes working with. Then there’s the part that randomly uses some other query library because they didn’t like the one we use in 95% of the code for some reason, so if you ever want to work with that code you need to keep two libraries in your head instead of one. Yes, the existing query library is out of date. Yes, the new one is better— in isolation. But having both is even worse than having the bad one!

GP is talking about "consistently bad" being worse than "inconsistently good". Not defending any inconsistency.

What you describe just sounds like "inconsistent AND bad".


I didn’t really get into it, but I think that most decisions which are not consistent are made with some feeling of “I will improve upon the existing state of this ugly codebase by introducing Good Decisions”. I’m sure even the authors of the Redux section of my code felt the same way. But code with two competing standards, only one good, is almost always worse than code with one bad standard. So breaking with consistency must be carefully considered, and the developers must have the drive to push their work forward rather than just leaving behind an isle of goodness.

You're getting a lot of pushback in the comments here and I don't understand why. This is exactly right. Stay consistent with the existing way or commit to changing it all (not necessary all at once) so it's consistent again, but better.

Nobody is pushing back about "commit to changing all".

Nobody is denying that "inconsistent" can be bad on its own.

But you can't say that "inconsistent but good" is bad by providing an example of how "inconsistent and bad" is bad.


I don't know what to say.

That’s a logic error. The claim was that "inconsistent but good" can exist, not that "inconsistent == good". Responding with one example where "inconsistent" turned out badly is a totally different claim and doesn't refute what GP says.


Who said that I only had one example? I just listed one so you'd have an idea of what I was talking about. I could give you like a hundred. This is a heuristic I've developed over a lot of time working in codebases with inconsistencies and repeatedly getting burned.

I'm not disagreeing with your example and conclusion, and I've seen many of those.

I actually agree that half-assing a problem is not the best solution.

It's just that they are not examples of "inconsistent but good". They are not even "good", just "inconsistent". You said yourself that they're worse overall.


The author never really defines "consistency" anyway. Consistency of what?

I've never seen consistency of libraries and even programming languages have a negative impact. Conversely, the situation you describe, or even going out of the way to use $next_lang entirely, is almost always a bad idea.

The consistency of where to place your braces is important within a given code base and teams working on it, but not that important across them, because each one is internally consistent. Conversely, two code bases and teams using two DBs that solve the same problem is likely not a good idea because now you have two types of DBs to maintain. Also, if one team solves a DB-specific problem, say, a performance issue, it might not be obvious how the other team might be able to pick up the results of that work and benefit from it.

So I don't know. I think the answer depends on how you define "consistency", which OP hasn't done very well.


This is where an architect is useful, because they can ask "why?"

Sometimes there is a reason! Sometimes there isn't a reason, but it might be something we want to move everything over to if it works well and will rip out if it doesn't. Sometimes it's just someone who believes that functional programming is Objectively Better, and those are when an architect can say "nope, you don't get to be anti-social."

The best architects will identify some hairy problem that would benefit from those skills and get management to point the engineer in that direction instead.

A system that requires homogeneity to function is limited in the kinds of problems it can solve well. But that shouldn't be an excuse to ignore our coworkers (or the other teams: I've recently been seeing cowboy teams be an even bigger problem than cowboy coders.)


Ugh, I remember a "senior" full-stack dev coming to me with various ideas for the backend - start using typeorm instead of sequelize and replace nestjs with express, for the tickets they would work on, despite having no experience with any of these. The mess of different libraries and frameworks they left in the frontend will haunt that software for years lol.

It's essentially the same problem as https://xkcd.com/927/ [How Standards Proliferate]

So following that silly comic, you'd ban utf-8 because it breaks consistency? (even though in reality it beat most other standards, not just became the 15th)

This isn't really about software quality, it's about the entire organization.

Consistency enables velocity. If there is consistency, devs can start to make assumptions. "Auth is here, database is there, this is how we handle ABC". Possible problems show up in reviews by being different to expectation. "Hey, where's XYZ?", "Why are you querying the database in the constructor?"

Onboarding between teams becomes a lot easier, ramp up time is smaller.

Without consistency, you end up with lots of small pockets of behavior that cause downstream problems for the org as a whole.

Every team needs extra staff to handle load peaks, resulting in a lot of idle devs.

Senior devs can't properly guess where the problematic parts of fixes or features would be. They don't need to know the details, just where things will be _difficult_.

Every feature requires coordination between the teams, with queuing and prioritizing until local staff become available.

Finally, consistency allows classes of bugs to be fixed once. Fix it once and migrate everyone to the new style.


Yeah that line gave me a twitch. Reading on though it's more about the resulting coherence and correctness rather than like the Ralph Waldo Emerson quote: "A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines."

I agree. It's only the foolish consistency that's problematic. A sensible consistency does, as you say, provide a coherence. William James, who overlapped Emerson, has a lot to say about positive habits.

My reading of it is that it also violates the Boy Scout Rule. That is to say: if improving some portion of the codebase would make it better, but inconsistent, you should avoid the improvement; which is something that I would disagree with.

I think adherence to “consistency is more important than ‘good design’” naturally leads to boiling the ocean refactoring and/or rewrites, which are far riskier endeavors with lower success rates than iterative refactoring of a working system over time.


If improving a portion of the codebase makes it better, but inconsistent...

migrate the rest of the codebase!

Then everyone benefits from the discovery.

If that's difficult, write or find tooling to make that possible.

It's in the "if it hurts, do it more often" school of software dev.

https://martinfowler.com/bliki/FrequencyReducesDifficulty.ht...


The problem with small refactors over time is that your information about what constitutes a good/complete model of your system increases over time as you understand customers and encounter edge cases. Small refactors over time can cause architectural churn and bad abstractions. Additionally, if you ever want to do a programmatic rewrite of the code, a bunch of small refactors makes that more difficult; with a single surface you can sometimes just use a macro to change everything all at once.

This is an example of a premature optimization. The reason it can still be good is that large refactors are an art that most people haven't suffered enough to master. There are patterns to make it tractable, but it's riskier and engineers often aren't personally invested in their codebases enough to bother over just fixing the few things that personally drive them nuts.


If improving some portion of the codebase would make it better, but inconsistent, you should avoid the improvement. Take note, file a ticket, make a quick branch, and get back to what you were working on; later implement that improvement across the whole codebase as its own change, keeping things consistent.

if you have some purported improvement to a codebase that would make it inconsistent, then it's a matter of taste, not fact, whether it is actually an improvement.

Consistency is best, with a slow, gradual, measured movement towards 'better' where possible. When and where the opportunity strikes.

If you see a massive 50 line if/else/if/else block that can be replaced with a couple calls to std::minmax, in code that you are working on, why not replace it?
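For concreteness, here's a toy version of that kind of easy win (the function and variable names are made up for illustration, not taken from any real codebase):

  // Hypothetical example: a hand-rolled "smaller and larger of two readings" branch
  // collapsed into a single std::minmax call.
  #include <algorithm>
  #include <utility>

  std::pair<int, int> low_high(int a, int b) {
      // before (sketch): if (a < b) { lo = a; hi = b; } else { lo = b; hi = a; }
      return std::minmax(a, b);  // returns {smaller, larger}
  }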

But don't go trying to rewrite everything at once. Little improvements here or there whenever you touch the code. Look for the 'easy wins' which are obvious based on more modern approaches. Don't re-write already well-written code into a new form if it doesn't benefit anything.


I feel like “be consistent” is a rule that applies very broadly.

There are absolutely exceptions and nuances. But I think when weighing trade-offs, program makers by and large deeply under-weight being consistent.


I have the opposite experience. Consistency is commonly enforced in bigger corporations while its value is not that high (often negative). Lots of strategies/patterns are promoted and blindly followed without a brief reflection that maybe this is a bad solution for certain problems. TDD, onion/hexagonal architecture, SPA, React, etc.

Moreover, saying that consistency is more important than good design is like saying that eating leafy greens is more important than a good diet.

Yeah, it's called expectations: consistently bad is predictable

software that has "good" and "bad" parts is unpredictable


> software that has "good" and "bad" parts is unpredictable

Software that has only "bad" parts is also very unpredictable.

(Unless "bad" means something else than "bad", it's hard to keep up with the lingo)


that's why I wrote the first part of my comment

your example is just bad code that is unpredictable


And I disagree.

My assertion is that software that has only bad parts is way more unpredictable than software that has both good and bad.

For multiple reasons: because "bad" is not necessarily internally consistent. Because it's buggy.

Unless, again, "bad" here means "objectively good quality but I get to call it bad because it's not in the way I like to write code".


So we should all write bad code to keep it predictable? raising the quality of the codebase is unacceptable under this premise.

Possibly. Probably even.

High quality and consistent > Low quality and consistent > Variable quality and inconsistent. If you're going to be the cause of the regression into variable quality and inconsistent you'd better deliver on bringing it back up to high quality and consistent. That's a lot of work that most people aren't cut out for because it's usually not a technical change but a cultural change that's needed. How did a codebase get into the state of being below standards? How are you going to prevent that from happening again? You are unlikely to Pull Request your way out of that situation.


"So we should all write bad code to keep it predictable?"

it's true and false at the same time; it depends

here I can give an example: you are maintaining a production system that has been running for years

there is a flaw in some part of the codebase that is probably ignored, either because

1. bad implementation/hacky way

2. the system outgrew the implementation

so you try to "fix" it, but suddenly other internal tools stop working, customers contact support because it changed the behaviour on their end, some CI randomly fails, etc.

software doesn't exist in a vacuum; complex interactions sometimes prevent "good" code from existing, because that's just reality

I don't like it either but this is just what it is


X, Y, Z are not public infrastructure, why should payments be any different?

