Hacker News

Interesting, didn't Rage pioneer the on-demand texture streaming approach? It fits well with the DMA SSDs in the coming console generation.

Wow, that's now a decade ago.



I believe it did, along with pioneering megatextures in video games.

What I am hoping for, and have been hoping for a while, is for game engines to start integrating AI into their workflows. There are some tools out there that leverage machine learning to some extent, but what I would love to see are tools that can, for instance, take video footage of an actor, infer the bone structure to a decent degree, and transfer that animation onto a model. Or a tool that allows style transfer of an image onto a 3D model, so we can have realistically dynamic brush tools for environments; or integrating face-generation GANs into modeling to reduce sculpting effort. Not to mention tools that can dynamically and infinitely scale 3D models based on material information.

I know some tools exist that can do some of these things reasonably well, but it can be taken even further.

Truly, the power of AI in video game tooling has yet to be unlocked. I believe video games as a medium are in a position to push for practical applications of new and exciting research, second only to CGI films. It's exciting what's in store for the future, and I'm sure Carmack can appreciate the kind of breakthroughs that Microsoft would be able to foster.


Texture streaming was a thing before Rage.

id Tech implemented megatextures earlier, too; Quake Wars relied heavily on that feature.
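The core idea behind megatexture-style streaming is to keep only the texture tiles the camera can currently see resident in memory, fetching missing tiles from disk on demand and evicting the least recently used ones. Here is a minimal sketch in Python; the tile IDs, cache size, and toy loader are all hypothetical stand-ins for a real renderer's feedback pass and SSD reads:

```python
from collections import OrderedDict

CACHE_CAPACITY = 4  # max resident tiles; kept tiny for illustration

class TileCache:
    """LRU cache of texture tiles, streamed in on demand."""
    def __init__(self, capacity, loader):
        self.capacity = capacity
        self.loader = loader           # fetches tile data by id
        self.resident = OrderedDict()  # tile_id -> tile data, in LRU order

    def request(self, tile_id):
        """Return tile data, loading it from disk if not resident."""
        if tile_id in self.resident:
            self.resident.move_to_end(tile_id)  # mark as recently used
            return self.resident[tile_id]
        data = self.loader(tile_id)             # e.g. SSD read + decompress
        self.resident[tile_id] = data
        if len(self.resident) > self.capacity:
            self.resident.popitem(last=False)   # evict least recently used
        return data

# Toy loader standing in for a disk read.
def load_from_disk(tile_id):
    return f"pixels-for-{tile_id}"

cache = TileCache(CACHE_CAPACITY, load_from_disk)
# Each frame, the renderer reports which tiles are visible:
for tile in ["a", "b", "a", "c", "d", "e"]:
    cache.request(tile)
print(sorted(cache.resident))  # tiles still resident after eviction
```

With direct-to-memory SSD access on newer consoles, the `loader` step gets cheap enough that tiles can be fetched within a frame or two of becoming visible, which is why the approach fits that hardware so well.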



