#12335 posted by Lunaran on 2015/05/12 18:14:27
"would it not make more sense to set lightmap scale per light instead?"
you're right, this would, in fact, not make more sense. you're picturing only the use case where you put a torch in a little lantern cage thing to throw those neat lines, and you'd rather just tag the light to upres every surface it touches. this sounds like a recipe for winding up with tons of faces with far higher resolution than they need.
I'd be just as excited about selectively turning lightmap resolution DOWN, in areas where high ceilings are shrouded in darkness or on the backs of pillars or what have you where there isn't a lot of variation or contrast in the lighting that actually hits that face. I'm a pretty obsessive optimizer though.
Having control only per brush and not per face is a bit of a shame, but I understand it's a .map format limitation. Maybe we need a new, extensible .map format first :)
#12336 posted by JneeraZ on 2015/05/12 18:29:12
"this sounds like a recipe for winding up with tons of faces with far higher resolution than they need. "
But again, who does this affect? What machine is having a problem loading Quake levels because of texture/lightmap memory?
Turn that dial to 11, let's go! :)
"What machine is having a problem loading Quake levels because of texture/lightmap memory?"
Lots of them were until the recent version of QS.
There's a fallacy here that a 1996 engine can have no bottlenecks in the year 2015 - it absolutely can.
#12338 posted by JneeraZ on 2015/05/12 19:00:47
Because of texture memory? Not other limitations?
Some maps slow my Surface down
#12340 posted by JneeraZ on 2015/05/12 19:10:09
:-|
To Be Honest
#12341 posted by Kinn on 2015/05/12 19:16:55
I'm kinda with WarrenM here.
If the whole map runs fine with increased lightmap resolution, I think I'd just go with that, rather than spending a ton of time setting up funcs all over the place where detailed shadows appear.
Hmmmm
#12342 posted by Kinn on 2015/05/12 19:23:03
Brainfart really
let's say the compiler does a hi-res lighting pass. After this pass, could you then do something a bit clever like automatically detect surfaces that have uniform or near uniform lighting and automatically downscale the lightmap resolution there as appropriate?
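Roughly, the detection could be a contrast check: keep halving the lightmap while its brightest and darkest luxels stay close together. A toy sketch in Python for illustration (the real compiler would be C, and the contrast threshold here is a made-up knob):

```python
import numpy as np

def reduce_uniform_lightmap(lm, contrast_threshold=8, min_size=2):
    """Halve a lightmap's resolution while its contrast stays low.

    lm: 2D array of luxel brightness (0-255).
    Repeatedly 2x2-box-filters the map down as long as the spread
    between brightest and darkest luxel stays under the threshold,
    i.e. the surface is near-uniformly lit.
    """
    lm = lm.astype(np.float32)
    while min(lm.shape) > min_size and (lm.max() - lm.min()) < contrast_threshold:
        h, w = lm.shape
        # average each 2x2 block of luxels down to one
        lm = lm.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return lm
```

A map with a sharp shadow edge fails the contrast test immediately and keeps its full resolution, which is the behaviour you'd want.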
#12343 posted by Spike on 2015/05/12 19:31:23
@WarrenM, if you compile a box room and whack the lightmap scaling to 0.0625 (1:1), then fire a rocket, your framerate WILL plummet (you can use rtlight-style lighting to avoid any slowdown, or just use gl-style flashblends instead, in any current engine including vanilla... assuming you were getting those slowdowns from crazy lightmap resolutions in vanilla anyway).
throw in a load of flickering lightstyles and your framerate will stutter 10 times a second (this can be worked around like rmqe already does, although this is not without its own cost).
high lightmap resolutions are NOT without their cost, and using a scale lower than 0.25 (4 texels to each luxel) is probably going to be abusive unless constrained to specific surfaces. 0.25 should generally be okay though.
For reference, 0.0625 gives 176mb for just vanilla start.bsp. There is no way around the load times, if nothing else. Using that resolution on every single surface a light might hit is just impractical, but you can get away with it if it's just one or two surfaces.
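The arithmetic behind numbers like that is easy to sanity-check: vanilla places one luxel per 16x16 world units, and the scale value shrinks that grid linearly, so luxel count grows with the square. A rough estimate, assuming 3-byte RGB luxels and ignoring per-face rounding and padding (vanilla stores 1 byte per luxel, and each extra lightstyle on a face adds another full copy):

```python
def lightmap_bytes(area_units2, scale=1.0, styles=1, bytes_per_luxel=3):
    """Rough lightmap memory estimate for a given surface area.

    Vanilla Quake uses one luxel per 16x16 world units; a scale
    of 0.0625 shrinks that to 1x1 (16x the linear density, 256x
    the luxel count).  Ignores per-face padding, so lower bound.
    """
    units_per_luxel = 16.0 * scale
    luxels = area_units2 / (units_per_luxel ** 2)
    return int(luxels * bytes_per_luxel * styles)
```

A single 256x256-unit wall goes from 768 bytes at scale 1.0 to 192kb at 0.0625, which is why doing it map-wide blows up so fast.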
@necros, typically you want the lightmap resolution concentrated behind the fence texture (typically ground). spreading it around the entire light is going to be wasteful. see my remark about <0.25 being abusive.
it would be nice to auto-detect the surfaces that got light applied to them through a fence texture, or areas with harsh shadows, possibly by just calculating it at a high res and then reducing the resolution if it doesn't have many harsh boundaries on it, but this is likely to result in either every surface being high-res or significant glitches between highres and lowres surfaces.
@ericw, presumably the "black" texture should have its lightmap scale set to a low resolution by default, as it shouldn't normally be significant anyway (really this should be surfaces that have all texels set to 0).
#12344 posted by JneeraZ on 2015/05/12 19:36:09
Maybe it could use the standard res lightmaps for dynamic lighting effects ... like, overlay them on the high res lightmaps. Odds are you wouldn't even notice.
#12345 posted by ericw on 2015/05/12 21:30:23
Kinn, I was thinking the same thing, automatically lowering the resolution when it won't be visible would be cool. Just need to try coding it.
Yeah
#12346 posted by Kinn on 2015/05/12 21:47:08
there's gotta be a pretty standard image processing algorithm that finds the minimum resolution for an image that still preserves its significant details.
#12347 posted by necros on 2015/05/13 02:27:36
my original suggestion was with the implication that all lightmaps would be high res, and only indicated lights would cast high res traces (because it is the traces that will take the most time). For other lights, they would put their low res maps (upscaled to match) on the high res lightmap. (like simply increasing a bitmap's size by 2x, maybe use some filtering to smooth out the results)
I have had light take several hours when doing final passes with -extra4 and all that other stuff.
increasing the number of traces to get high res light maps would increase that by an order of magnitude, I would imagine...
unfortunately, I hadn't considered how dynamic lights would work, what with having to update all those lightmaps.
could there be 2 full lightmaps, one low res for dynamic and one completely static high res one and then just do some additive blending at run time?
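The blend step itself would be cheap: conceptually just upsample the low-res dynamic map, add, clamp, every frame. A sketch of that compositing (nearest-neighbour upsampling on the CPU for clarity; a real engine would filter bilinearly on the GPU):

```python
import numpy as np

def blend_lightmaps(static_hi, dynamic_lo):
    """Additively blend a low-res dynamic lightmap over a
    high-res static one, as a per-frame compositing step.

    dynamic_lo is nearest-neighbour upsampled to the static
    map's resolution, added, and clamped to the 0-255 range.
    """
    fy = static_hi.shape[0] // dynamic_lo.shape[0]
    fx = static_hi.shape[1] // dynamic_lo.shape[1]
    up = dynamic_lo.repeat(fy, axis=0).repeat(fx, axis=1)
    return np.clip(static_hi.astype(np.int32) + up, 0, 255).astype(np.uint8)
```

The upload cost would then scale with the low-res map only, which is the point: the expensive high-res data never changes after load.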
also, for setting lightmap res on faces, would that just be a huge pain? i.e. setting 1 light to be high res vs setting all the walls, doors, plats, and whatever else in the room to high res?
#12348 posted by Spike on 2015/05/13 03:18:46
the problem with per-surface lightmap res is how you get it from the editor to the qbsp. it would require a new .map format instead of just new entity fields. and that would require editing all the map editors etc too, which becomes a significant undertaking.
I doubt that it would be that hard for the qbsp though.
#12349 posted by - on 2015/05/13 03:33:44
You could do it with the standard .map format by using texture names as flags... so a surface with the texture "*texturename*$lm256" or something like that would be flagged in the light compiler as using a 256x256 lightmap res.
Might be a slight pain to work with, but if you're only going to utilize the highres lightmaps in key areas, it shouldn't be too bad to set up
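Splitting such a suffix off the texture name would be trivial on the compiler side. A sketch, with the `$lm<N>` convention being purely hypothetical (it's the suggestion above, not anything an existing tool parses):

```python
import re

def parse_lm_flag(texname, default_res=None):
    """Split a '$lm<N>' lightmap-resolution flag off a texture name.

    Returns (clean_name, res): 'wall1$lm256' -> ('wall1', 256);
    names without the flag come back unchanged with default_res.
    The suffix convention itself is hypothetical.
    """
    m = re.match(r'^(.*)\$lm(\d+)$', texname)
    if m:
        return m.group(1), int(m.group(2))
    return texname, default_res
```

The compiler would strip the suffix before texture lookup so the same wad texture can be reused at several resolutions.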
Pixel Journeys
#12350 posted by sock on 2015/05/16 23:19:46
#12351 posted by JneeraZ on 2015/05/17 00:08:03
Nice, I really like the lava river ... the organic shoreline with the structure of the pillars works really well.
Very Nice.
#12352 posted by Shambler on 2015/05/17 11:01:07
Also what Warren said, lava river is the one.
Spoogetastic Sock
#12353 posted by nitin on 2015/05/17 14:09:12
how big is this and when is it coming?
Duke Screenie
#12354 posted by quakis on 2015/05/18 01:44:52
Near completion with a few tidbits left to do.
http://i.imgur.com/EpzDPbq.png
Neat!
What editor are you using?
#12356 posted by skacky on 2015/05/18 07:59:34
Nice, I never say no to new Dook maps.
Searching For Link
#12358 posted by dogman on 2015/05/18 20:33:21
Hi, I was browsing this thread and came across a screenshot on some blog, I'm trying to find it again. It was of a fairly basic scene, with two views. One with void or dark area, the other with lava. I'm pretty sure it was from this thread, a few pages back. Anyone recall?
Arcane Dimensions?
#12359 posted by ericw on 2015/05/18 20:39:25