
Sorry, but you keep talking about timing. With network speeds nowadays, the audio files get completely or almost completely buffered on the PC from whatever source you are reading them (a NAS, for example), and then played. Same with video (even when it doesn't buffer the entire video). And video/audio frames are interleaved.

It's not that every packet is "output" as soon as it arrives at the PC.

Or maybe I didn't understand what you are saying.



As an example, let's say you're playing a video.

A playback device produces a stream of audio and video data that is consistent and kept together. But if there's a network involved, say you have stereo wifi speakers, then the video and audio streams get separated. Maybe the playback device sends the data to the display over HDMI, and that device sends the audio over the network to the speakers. Now there are two delays: one for decoding and presenting the video, and a second, separate one for transmitting and then decoding the audio. The second one varies depending on the network.
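
To make that concrete, here is a minimal sketch (Python, with made-up delay values; none of these numbers come from the thread) of the two delay paths and the resulting audio/video skew:

    # Hypothetical latencies for the setup described above: video stays on the
    # HDMI/display path, audio goes to wifi speakers over the network.
    VIDEO_DECODE_PRESENT_MS = 60   # assumed fixed decode + display latency
    AUDIO_DECODE_MS = 20           # assumed speaker-side decode latency

    def av_skew_ms(audio_network_ms):
        """Positive result = audio lands later than video (lip-sync error)."""
        audio_path = audio_network_ms + AUDIO_DECODE_MS
        return audio_path - VIDEO_DECODE_PRESENT_MS

    # The network term jitters, so the skew is not constant:
    for net in (30, 45, 90):
        print(f"network delay {net} ms -> skew {av_skew_ms(net):+} ms")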


These (audio sync) are old and solvable problems. For example, I'm using HDMI ARC to output my TV's audio, and the TV has a menu to align video and audio (by inserting a delay). In my case I don't need it.
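
For illustration, here is a minimal sketch (Python, with assumed delay values, not measured ones) of the kind of fixed-delay compensation such a menu applies: the faster path, here the video, is held back by a configured offset so both paths land together.

    AUDIO_PATH_DELAY_MS = 80   # assumed total delay of the ARC/network audio path
    VIDEO_PATH_DELAY_MS = 20   # assumed delay of the local video path

    def compensated_video_pts(pts_ms):
        """Shift video presentation time so it matches the slower audio path."""
        delay_to_insert = max(0, AUDIO_PATH_DELAY_MS - VIDEO_PATH_DELAY_MS)
        return pts_ms + delay_to_insert

    print(compensated_video_pts(0))   # frame originally due at t=0 now shows at t=60 ms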

But that's delay, not the integrity of the data being played back. If the network is bad and there are retransmissions, your data can still arrive intact, without losses (depending on the protocol).

It also happens for musicians on stage. They play and after some delay they hear their own sound back. So they use monitors to hear themselves and the rest of the band.

During lockdown, people got together to play music with special software. That tells you something about how well known this problem is and how it can be solved in just about every situation.



