Quake Custom Engines
Discuss modified Quake engines here, I guess. What engines do you use? What are the pros/cons of existing engines? What features would you like to see implemented/removed?
 
really? i wouldn't really let QME be a deciding factor at all. it's shit...
Best results with models always involve keeping QME out of the workflow. 
 
backwards compatibility is pointless if it's just going to look terrible anyway

I tend to agree. The only reason to use 16-bit or greater accuracy is for large models with a large range of movement that *would* look terrible in 8-bit. If most users are going to see an awful mangled .mdl version of it anyway, then I would say the idea has failed. 
things like the vermis come to mind. or monsters that move around their model space a lot. 
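For context, the wobble comes from MDL's vertex encoding: each coordinate is a single byte rescaled by a per-model scale and origin, so precision degrades as the model's bounding box grows. A minimal illustration (the function is a simplified stand-in, not real engine code):

```c
#include <stdio.h>

/* MDL reconstructs each vertex coordinate as:
 *   world = scale_origin + scale * byte   (byte in 0..255)
 * so the smallest representable step along an axis is extent / 255,
 * which is why big models "swim" and small ones look fine. */
static float mdl_quant_step(float extent)
{
    return extent / 255.0f;
}

static void print_steps(void)
{
    /* a grunt-sized model vs. something vermis-sized */
    printf("64-unit model:   step = %.3f units\n", mdl_quant_step(64.0f));
    printf("1024-unit model: step = %.3f units\n", mdl_quant_step(1024.0f));
}
```

A 64-unit model quantizes to steps of about a quarter of a unit, while a 1024-unit model jumps in steps of roughly four units per vertex per frame.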
Well 
It could be that using md3 makes more sense than an MDL plus a new 16-bit refinement format.

I could have a look at how difficult a minimal md3 implementation for fitzquake/qs/glquake would be. 
Md3 
would be just awesome. 
Kinn, Can We Get A Preview Of Your Models? 
Pwease ^_^ 
Ericw 
maybe, except with preach's idea, a 16 bit model would be fully compatible with even winquake/glquake. 
To Be Honest 
If you're gonna have a separate file, you may as well make it an MD3... 
 
True dat. Add MD3 support to the engine and boo-ya... 
 
If nothing else, md3 support saves creating a new standard. :P
Plus it's already supported by a load of engines.


Software renderers might get something out of: http://sourceforge.net/p/fteqw/code/2002/tree/trunk/engine/sw/sw_model.c#l2908 but it's too long ago for me to really remember much about it.

I'm tempted to port fte's mdl/md2/md3/iqm/zym/dpm/psk/md5 loader+renderer to glquake, but I doubt any engines would actually use any resulting patch which kinda makes it pointless, plus I'm feeling lazy. 
@Spike 
I'm tempted to port fte's mdl/md2/md3/iqm/zym/dpm/psk/md5 loader+renderer to glquake, but I doubt any engines would actually use any resulting patch which kinda makes it pointless

I touched on this above, but this is actually a great example of why certain features don't make it to widespread adoption.

You're right in that nobody would use the resulting patch, and the reason why is that the resulting patch can only be overly complex. 8 model formats and a renderer (I assume it's a single common renderer for all 8) is just too much. It touches too many parts of the code, and adoption would involve too much surgery, particularly if an engine author has already done work on these parts of the code themselves.

In order to drive adoption features need to be easy to adopt. By way of contrast, an MD2 loader that brutalizes it into the same in-memory format as MDL (and they're not too different so it wouldn't be too difficult) would be just a simple matter of a drop-'n'-go loader, some simple struct modifications and another line or two here and there. That's the kind of feature that gets adopted. 
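As a sketch of that drop-'n'-go idea (simplified stand-in structs, not any engine's real headers): MD2 frame vertices use the same byte-plus-scale compression as MDL, so a loader can translate them almost field-for-field into the MDL in-memory layout:

```c
/* Simplified stand-ins for the engine structs -- illustrative only. */
typedef struct {
    unsigned char v[3];              /* byte-compressed x,y,z */
    unsigned char lightnormalindex;
} trivertx_t;                        /* MDL-style in-memory frame vertex */

typedef struct {
    unsigned char v[3];
    unsigned char lightnormalindex;
} dmd2trivertx_t;                    /* MD2-style on-disk frame vertex */

/* Both formats store vertices as origin + scale * byte, so "loading"
 * an MD2 frame into the MDL layout is little more than a copy -- the
 * rest of the engine never has to know the data came from an MD2. */
static void MD2_FrameToMDL(const dmd2trivertx_t *in, trivertx_t *out,
                           int numverts)
{
    for (int i = 0; i < numverts; i++) {
        out[i].v[0] = in[i].v[0];
        out[i].v[1] = in[i].v[1];
        out[i].v[2] = in[i].v[2];
        out[i].lightnormalindex = in[i].lightnormalindex;
    }
}
```

The real loader would still need to handle headers, skins, and triangle/texcoord layout differences, but the per-frame vertex data really is this close.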
@mh 
1) but this is actually a great example of why certain features don't make it to widespread adoption

2) particularly if an engine author has already done work on these parts of the code themselves

In very recent times, I have become increasingly convinced that many programming languages --- especially C --- are broken by design.

At first, this sounds crazy, because C is incredibly powerful and amazingly well constructed. And the assertion that C is "bad" means virtually every programming language is bad, because most of them mimic its behaviors.

But I am convinced C is a terrible programming language (and so are almost all the ones we use):

Let's assume there is problem X and solution Y.

1. In an "ideal programming language" (which probably does not exist at this time), the abstract solution would be reduced directly to code. In the languages we actually have, the abstract solution instead becomes a very specific implementation that drifts away from the abstract solution.

This is a characteristic of most programming languages -- and it is terrible.

2. C offers too many ways of doing things. Many of these ways are ridiculous. And coding towards a specific implementation vs. abstract solution is actually rewarded by the language. So you get stuff like this:

g = (byte)((((k & 0x03) << 3) + ((j & 0xE0) >> 5)) << 3);

3. A good example of C encouraging drift away from the abstract solution is short-circuit evaluation. In a specific implementation, this might make sense, but stepping back a bit, it would deny a highly intelligent or advanced compiler the ability to derive intent. The order of logic in a statement using short-circuit evaluation doesn't just obscure the intent --- it removes the information permanently --- so a highly advanced compiler could never know whether the order was important or merely the programmer's best guess at the right order for speed.
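A hypothetical illustration of that information loss: with short-circuit `&&`, the written order is all the language records. Here the order is mandatory (the first test guards a NULL dereference in the second), but nothing in the source distinguishes that from an order chosen merely as a speed guess:

```c
#include <stddef.h>

/* Illustrative functions, not from any real codebase. */
static int cheap_test(const int *p)     { return p != NULL; }
static int expensive_test(const int *p) { return *p > 100; }

static int both_pass(const int *p)
{
    /* The order here is load-bearing: swapping the operands would
     * dereference NULL when p == NULL.  But if the order were only a
     * performance guess, the source would look exactly the same --
     * which is the "permanently removed information" being described. */
    return cheap_test(p) && expensive_test(p);
}
```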

As a result the language doesn't offer:
1) Collective works
2) Additive gain
and so people spend their time reinventing the wheel.

And because the language offers so many coding styles and ways of doing things, you could draw an entire org chart of different styles.

This results in the kind of failures seen in Quake, where engines struggle to adopt features that have been commonplace in other engines for a decade --- because the abstract solution never gets reduced to code, but to an implementation-specific solution.

[Of course, if a language is ever written that does this right, it will have to be written in C ;-)] 
@Baker 
You've more or less restated the problem that OOP was meant to solve.

The nirvana here was supposed to be that you could have components expressed as reusable objects, and then all you need to know is the public interface of that object and you can link to it and use it without having to worry about its internal implementation.

Likewise if you're writing such a component you specify and document the public interface and the same magic happens for anyone using it.

At a sufficiently high level of abstraction using these objects becomes broadly analogous to putting Lego bricks together. Join a "model" object to a "renderer" object and you have support for a model format. Want to support another model format? Just drop in another object and if it understands the interface to the "renderer" (and vice-versa) you have the awesome.

True, the person writing such an object still has a lot more work to do, but that's a one-time-only job and then everyone gets to join the party.

Of course the end result we have today is somewhat different to this ideal world, but hey-ho, you can't win them all. 
 
http://www.cs.yale.edu/homes/perlis-alan/quotes.html -> " 93. When someone says "I want a programming language in which I need only say what I wish done," give him a lollipop. " 
But It Isn't True 
And you've been tricked! There may be 99 gems of actual wisdom on that page, but you've fallen for decoy with that one.

If two programmers would write essentially the same solution, the programmers are NOT coding.

Rather the programmers are, in fact, acting as the compiler.

This is a waste of human time. 
Well 
I think the point is more that observing that this is a problem does not bring us any closer to a solution. Languages like Haskell are finally getting more widely used these days so maybe there's hope for the world (although I found Haskell pretty hard for just getting things done when I used it, that was 15 years ago now though...). 
Baker 
The problem is this: A compiler is a transformation program that transforms one model to another. In the case of a C compiler, the input model is source code that conforms to the C spec, and the output is machine code that conforms to the spec of the targeted platform.

The input model is already an abstraction in that it abstracts (somewhat) from the details of the target platform. As you know, abstraction means omission of details, that is, every abstraction entails a (purposeful) loss of information. In the case of C, the missing information (platform details) can be added by the compiler because the information can be derived from the C spec and the platform spec.

What you want is even more abstraction, which again entails more loss of information. The question is, how can we get this information back during compilation? If you want an automated compiler from your more abstract input model, you need to store the missing information somewhere and add it back in during compilation. This is how toolkits work. For example, in a game toolkit like the latest Unreal Engine, you work on a much more abstract level, and the missing information (that is, how to render things, how to handle input, how to handle sound etc.) is added in when the game is assembled by the compiler.

Basically, the missing information is again stored in a platform specification. In this example, the platform specification is the specification of the engine, its toolchains, its file formats, and so on. Of course all of this abstraction comes at a price, and that is less flexibility. You are restricted in what you can do with the engine and how you can do it so that the "compiler" can add in the missing information.

But what you seem to want is having the ultimate flexibility AND a large degree of abstraction. You want to specify a very abstract solution and have the compiler figure out how to generate machine code from that. Again, the compiler needs to add the missing information, but where does it come from? If you formalize it somehow as a model, you must again limit the user / programmer in what she can do with your system.

And therein lies the problem: The more abstract your language, the more restricted it must be. Don't want to deal with memory representation? Use Java, but suffer the restrictions in performance and memory consumption. Don't want to specify types and such? Use Python/Ruby/whatever, but suffer the restrictions in performance and safety.

The challenge of abstraction is to balance it with flexibility, and apparently this is such a hard problem that the entire software engineering discipline has not been able to solve it (generally) and is nowhere near a solution either. Some abstractions (languages) are "good" in this sense and others are not, but we don't precisely know why.

But! you say - can't the computer guess the missing information somehow? My brain can do it, so the computer should be able to, too! No, it cannot, because it's not creative and needs precise mathematical descriptions. Only our brain can do this.

But! what about Prolog and such? You don't even need to specify a solution, only a problem! Yes, but specifying complex problems so that a system like Prolog can derive a solution is (in general) much harder than inventing a solution in your head and coding that up, even if that's hard too. Logical programming is very useful for a specific set of problems, but it is more targeted at the scientific and engineering communities and is usually applied in AI and computational linguistics. You can't write a 3D game in Prolog (AFAIK), nor would you want to.

So, I for one don't believe that there will ever be a system that will write your code for you. Or if computers ever get "smart" in this way, I doubt we will be around to use them. Maybe our grandchildren will. 
@sleepwalker 
make a trenchbroom for C code!.. 
Laugh 
Imho C is a necessary and best kludge to get the most out of hardware and hence is the underpinning of serious programming. C++ is an even uglier but still great kludge, for the reasons sleepwalkr says.

But I for one *love* my main language. Wish/Tcl is a beautiful archaic thing with aspects of functional programming, and great interfaces to a huge (slightly ugly) graphical toolkit, and to C/C++ as well.

wish <<!
pack [button .b -text "Func is.. " -command {.b configure -text Dead ; update ; after 1000 ; destroy .}]
!
MD3 YES? :) 
MD3 support for Fitzquake and Quakespasm!? :)
I for one can use this already... I'm using it to replace models for KMQuake2 and am currently working on a new Shambler model for the original Quake. But of course it ends up as an MDL in Quakespasm and Fitzquake... My pipeline goes from 3ds Max, exported to FBX, and then I simply use Noesis for the conversion to MDL right now. It works perfectly, but the model format's lack of precision makes me sad. If somebody adds MD3 support I could pretty much instantly take advantage of it. I would still use the Quake palette though, because I prefer the retro pixel look.

IF you add it I will use it! Regardless

https://www.dropbox.com/s/ckylp58vipdh9e3/Shambler_Wip.jpg?dl=0

Here is a sneak preview for those wondering what it looks like. 
ACK Better Link 
Md3 
i've seen vertex dancing on md3 though... it's a lot less, but it's still there. is there no better format? 
Oh... 
what about something similar to external textures...
so you have your player.mdl
and then you have player_0.fbx, player_1.fbx, etc... and give external model information.

the workflow these days is to export multiple single frames and use tools to merge them into a .mdl anyway, so for the modeller it's not adding any steps. 
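A minimal sketch of the lookup side of that idea, assuming the naming scheme from the post (`player.mdl` as the fallback, `player_0.fbx`, `player_1.fbx`, ... carrying the external frames); the function name is hypothetical:

```c
#include <stdio.h>
#include <string.h>

/* Build the candidate external filename for one frame of a model,
 * e.g. "progs/player.mdl" + frame 0 -> "progs/player_0.fbx".
 * Purely illustrative -- no engine actually uses this scheme. */
static void ExternalFrameName(const char *mdlname, int frame,
                              char *out, size_t outsize)
{
    char base[256];
    strncpy(base, mdlname, sizeof(base) - 1);
    base[sizeof(base) - 1] = '\0';

    char *dot = strrchr(base, '.');
    if (dot)
        *dot = '\0';                  /* strip the ".mdl" extension */

    snprintf(out, outsize, "%s_%d.fbx", base, frame);
}
```

The engine would try the external file first and fall silently back to the .mdl frame when it isn't there, much as external texture replacement works today.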
@mwh, Sleepwalkr, Etc. 
mwh: I think the point is more that observing that this is a problem does not bring us any closer to a solution.

I just want to prevent myself from writing bad code in the future (and I hope I have the definition correct and complete). And I needed to know what "bad code" looks like --- bad code isn't obvious.

That is all I can do.

sleepwalkr: So, I for one don't believe that there will ever be a system that will write your code for you.

Is anyone actually writing any complex code to begin with? I don't know:

1) That anyone is.
2) Whether it is even possible to do so.

You've put tons of work into Trenchbroom, but isn't it all rendering triangles by distance at the end of a frame, when everything is said and done?

Developers: creators of simple works, using crude tools. Rewarded with praise for winning the endurance and stamina test using crude tools that are hard to understand, use correctly and master. The idea conceived in seconds, the debugging achieved in hours, but the implementation --- should it ever come --- will take months, if ever at all.

However, if you master a low-level language like C, you can do almost anything without any natural limitations except time. Knowing time is the limitation, I want to re-evaluate my use of it.

I do know the definition of insanity is doing the same thing over and over and expecting a different result.

I think this sums up current thoughts on the subject quite well: http://forums.inside3d.com/viewtopic.php?f=1&t=1283&start=2

I debated on clicking 'Submit'. Logic says no. The Labatt's Blue says "apathy now". And who am I to question the logic of beer! 
Too Bad To See A Nice Post Go To Waste 
Just a quick reply then:

You've put tons of work into Trenchbroom, but isn't it all rendering triangles by distance at the end of a frame, when everything is said and done?

Not by a long shot. 
Website copyright © 2002-2024 John Fitzgibbons. All posts are copyright their respective authors.