#101 posted by h on 2018/09/21 01:43:55
players don't remember models or formats, they remember maps.
#102 posted by khreathor on 2018/09/21 02:58:59
"the 2 posting crap at me were clearly Kinn and/or khreathor, if you want to find out just check ip logs."
Pls... I'm not a 13-year-old small-dick troll who has to hide behind anonymity to insult people... I can do that while logged in :D
"one of you suggested a poly-count restriction on the engine to prevent new things. great idea if you apply it to the mappers too and not just the modellers."
Yeah, but maps still keep the retro look; they're just getting bigger (in most cases). Their "poly" density feels right for retro stuff.
It won't happen with models. You'll build a 50k-tri model and put a 2k texture on it, but the entity's size in game will stay the same. It will look off imo, with weird pixel and polygon density. The same thing happens when you put 4k textures on a low-poly level; it doesn't feel right.
"i prefer to expand limits to make quake better"
We can expand engines like FTE and keep QS lightweight for stuff closer to vanilla. I don't see any problem here, especially since FTE is good for TC mods.
Tbh the best option would be porting Quake to Unreal 4; then you have all the eye-candy stuff and all the modern tools available.
"i think without question the best model format would be either FBX or collada"
COLLADA is an intermediate XML format, which means big ASCII files that you have to parse to get the data out; you can't just load it like a binary file.
FBX has a binary format, but there are licensing problems when you use it in open-source projects. That's why Blender has a somewhat bastardized FBX importer/exporter that can't handle half of the features.
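To illustrate the binary-vs-intermediate point: a binary header like MD2's can be read straight into a struct, while a COLLADA .dae has to go through an XML parsing pass before you see a single vertex. A rough C sketch; the loader function is hypothetical and not taken from any particular engine:

    #include <stdio.h>
    #include <stdint.h>

    /* MD2 header layout: the file starts with fixed-size little-endian ints,
       so a single fread() gives you usable data immediately. */
    typedef struct {
        int32_t ident;      /* "IDP2" magic */
        int32_t version;    /* 8 for MD2 */
        int32_t skinwidth;
        int32_t skinheight;
        int32_t framesize;
        int32_t num_skins;
        int32_t num_verts;
        int32_t num_st;
        int32_t num_tris;
        int32_t num_glcmds;
        int32_t num_frames;
        int32_t ofs_skins;
        int32_t ofs_st;
        int32_t ofs_tris;
        int32_t ofs_frames;
        int32_t ofs_glcmds;
        int32_t ofs_end;
    } md2_header_t;

    /* Hypothetical loader: one read and the triangle count is in hand.
       A COLLADA file would instead need a full XML parse (tags, attributes,
       text-to-float conversion) before any geometry is available. */
    static int load_md2_tricount(const char *path)
    {
        md2_header_t hdr;
        FILE *f = fopen(path, "rb");
        if (!f || fread(&hdr, sizeof(hdr), 1, f) != 1) {
            if (f) fclose(f);
            return -1;
        }
        fclose(f);
        return hdr.num_tris;
    }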
khreathor
#103 posted by Kinn on 2018/09/21 10:18:17
I think his whole shtick was "Geez guys, why isn't it possible yet to make something that looks like UE4 in a Quake source port? Pfffttt" - referring not only to characters but to the environment as well. Realtime radiosity lighting, raytracing, blah blah blah.
The answer, of course, is the same answer every time this comes up: "Use an engine that's designed to do all that modern stuff, like UE4 - but you'll probably have to wait for Unreal Engine 5 for the realtime raytracing stuff, I don't think 4 does that yet."
@khreathor
I suggested the poly-count restriction - something like the MD2 format. The idea is to gradually increase the quality level so that the models keep up with the other things the engine is capable of, without going over the top, of course. What other quality-of-life stuff could other model formats bring us?
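As a rough illustration of what a hard restriction like that looks like engine-side, here's a C sketch in the spirit of the MAXALIAS*-style checks found in Quake-derived loaders; the constant, function name, and limit value are made up for the example:

    /* Sys_Error is the engine's fatal-error routine (printf-style). */
    extern void Sys_Error(const char *fmt, ...);

    /* Hypothetical hard cap; the exact number is illustrative. */
    #define MAX_MODEL_TRIS 4096

    void Mod_CheckAliasLimits(const char *name, int numtris)
    {
        /* A hard limit: refuse to load anything over the cap, which is what
           keeps model density in line with what the rest of the engine does. */
        if (numtris > MAX_MODEL_TRIS)
            Sys_Error("%s has too many triangles (%i, max %i)",
                      name, numtris, MAX_MODEL_TRIS);
    }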
So a more philosophical question then... Is it better to have the tools and limit yourself, or is it better to have the tools limit you? Because I can see both arguments and I don't know if there is a right answer.
I made a thing because I'm curious.
https://www.strawpoll.me/16504178
#105 posted by Esrael on 2018/09/22 10:30:31
I support the notion of having the tools but the mapper limiting himself/herself.
Maybe the tools could have "soft" limits, though, meaning they warn the user when some limit is exceeded but still support higher or unlimited values.
That way users aren't stifled, but they're still informed when they're about to cross the line.
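A soft limit along those lines is basically a warning threshold instead of a hard error. A minimal C sketch, assuming a hypothetical tool-side check; the function name and threshold are illustrative:

    #include <stdio.h>

    /* Hypothetical soft limit: the tool keeps working above the threshold,
       it just tells the user they have crossed the "vanilla-safe" line. */
    #define SOFT_MARKSURFACES_LIMIT 32767

    void CheckSoftLimit(const char *what, int count, int soft_limit)
    {
        if (count > soft_limit)
            fprintf(stderr,
                    "WARNING: %s = %i exceeds the soft limit of %i; "
                    "the map may not run in stock engines.\n",
                    what, count, soft_limit);
    }

    /* Usage: CheckSoftLimit("marksurfaces", num_marksurfaces, SOFT_MARKSURFACES_LIMIT); */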
What Kinn Said...
#106 posted by mh on 2018/09/22 22:02:55
The standard answer to "I'm making my own game and I want a custom Quake engine that has all of these features that $OTHER_ENGINE has" is "why don't you just use $OTHER_ENGINE instead?" It'll save you and everyone else a lot of pain and suffering.
What's sometimes difficult to get people to understand is that the cost of a new feature is not just the time to initially implement it.
As well as coding it up, testing it, debugging it, integrating it, and (optionally) documenting it, you also need to maintain it and ensure that it co-exists peacefully with any other new features that might also be implemented in future.
This increases complexity on an exponential scale, and it's well known that people have difficulty fully appreciating how quickly exponential growth explodes.
Take Nehahra as an example, because it's a good one. Few engines support Nehahra, and the reason is that it's a pain to integrate with other engine features that have since become standard.
To be more specific, take Nehahra fog. Nehahra uses a different fog algorithm from what is now standard, and while that seems like a simple case, it actually means that anything you do that touches or interacts with the fog code now has to be tested twice. And because virtually everything does, a simple feature has effectively doubled part of your workload.
Now implement Q3A-style fog volumes and watch your workload triple; you're also now in a position where you have three different fog systems and need to sort out how they interact with each other.
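To make the double-testing point concrete, here's a simplified C sketch of what carrying two fog systems looks like. The formulas, names, and mode enum are illustrative stand-ins, not actual engine code; the point is that every caller that fogs anything now goes through this branch, so every fog-related change has to be verified down both paths (and a third once fog volumes exist):

    #include <math.h>

    /* Two coexisting fog models: a "standard" exp2-style falloff and a
       second system with its own density semantics, Nehahra-style. */
    typedef enum { FOG_STANDARD, FOG_NEHAHRA } fogmode_t;

    float Fog_Blend(fogmode_t mode, float density, float dist)
    {
        float f;

        if (mode == FOG_STANDARD)
            f = expf(-(density * dist) * (density * dist)); /* exp2-style falloff */
        else
            f = expf(-density * dist);                      /* single-exponent variant */

        /* clamp to [0,1]: 1 = no fog, 0 = fully fogged */
        if (f < 0.0f) f = 0.0f;
        if (f > 1.0f) f = 1.0f;
        return f;
    }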
That's why, when people say things like "Can I have features X, Y and Z? Source and Unreal have them", the appropriate answer is "well, go use Source or Unreal then".