If 240 Seconds Equal 4 Minutes...



rdwpa said:

This thread is about whether a cyberdemon is a boss or an enemy.

He's a bonermy, obviously.

Enderkevin13 said:

This thread is irrelevant now.
The questions are answered, and
people are adding irrelevant info.

This answer is just hilarious.
You write down your sentences, and
you press Enter after a certain span.

Linguica said:

I can't tell if this is a troll or not.


I think it's called drugs and we need to get gez some help... Or get him to share so none of us realize just how far we've fallen after taking said drugs. That makes sense, right? Aww shit, now I need help.

Gez said:

A tic lasts 1/35th of a second, right? So how many milliseconds does that make? 28.571428571428571428571428571429... milliseconds. Problem is, not many OSes/libraries let you use floating point values for milliseconds in waiting and synchronizing functions, so a tic ends up lasting exactly 28 milliseconds. So 35 tics end up lasting only 0.98 seconds.


That's actually a legit concern. I had never thought too thoroughly about how exactly such a timing interval is programmed, though with the way Doom handles timing, it can only decide whether the last tic has elapsed or not before going on to process the next. It can actually determine if more than one has elapsed, but it can't "interpolate" game processing when it has missed one or more tics; it can only skip rendering if it's lagging behind, while game processing is always done fully, at whatever speed the CPU is capable of (though it'd be cool to get a full render even with slo-mo action).
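
To make that concrete, here's a rough sketch of the kind of loop I mean (not the actual vanilla code; I'm just borrowing I_GetTime, G_Ticker and D_Display as stand-in names for the timer read, the per-tic game logic and the renderer):

extern int  I_GetTime (void);   // current tic count, driven by the timer
extern void G_Ticker  (void);   // advances the game world by exactly one tic
extern void D_Display (void);   // renders the current frame

static int lasttic;

void RunFrame (void)
{
    int nowtic = I_GetTime ();

    // catch up on every tic that has elapsed since the last pass;
    // game logic is repeated per tic, never interpolated
    while (lasttic < nowtic)
    {
        G_Ticker ();
        lasttic++;
    }

    // rendering happens at most once per pass, so it's effectively
    // skipped for all but the latest tic when the game is lagging
    D_Display ();
}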

On Microsoft Windows, at least in versions up to Windows XP, it was a well-known limitation that GetTickCount only had a 15 or 16 ms resolution, so in a source port using that as a timing reference a "tic" could actually last anything from 30 to 32 ms. A more accurate timer available since Windows 7 has a resolution of 7 us, while on many UNIX-like systems the resolution is 1 ms, so in practice the actual duration of a tic is always only an approximation of what it should be. It's a bit unsettling to realize that on most Windows ports tics may actually always be 5-7% longer than they should be O_o
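
Just to spell out that 30-32 ms figure with a toy model (assuming the wait for the next tic can only end on an edge of the coarse clock):

#include <math.h>
#include <stdio.h>

// toy model: a wait of 'wanted' ms measured against a clock that only
// advances in steps of 'res' ms can only end on the next clock edge
static double quantize (double wanted, double res)
{
    return res * ceil (wanted / res);
}

int main (void)
{
    const double ideal = 1000.0 / 35.0;                 // ~28.571 ms per tic

    printf ("ideal       : %.3f ms per tic\n", ideal);
    printf ("15 ms clock : %.0f ms per tic\n", quantize (ideal, 15.0));  // 30
    printf ("16 ms clock : %.0f ms per tic\n", quantize (ideal, 16.0));  // 32
    return 0;
}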

I wonder what kind of timing reference was used in vanilla Doom: being DOS, the programmers certainly had more leeway to fuck with the 8253 PIT, or they could have simply gotten the timing from the VGA card, in which case it all depended on how accurate the "70 Hz" refresh rate actually was.

Edit: as an example, Chocolate Doom uses SDL_GetTicks for timing, which on Windows, as I understand it, ultimately uses the timeGetTime function, which is affected by the same 15-16 ms limitation as GetTickCount.
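
For what it's worth, here's the difference Gez's point hinges on in toy form: waiting a truncated 28 ms per tic drifts, while deriving the tic count from the millisecond counter itself does not (I'm only guessing that ports in the Chocolate Doom vein do something closer to the latter):

#include <stdint.h>
#include <stdio.h>

#define TICRATE 35

// Gez's scenario: wait a truncated 1000/35 = 28 ms per tic, so the
// 0.571 ms remainder is thrown away on every tic and accumulates.
static uint32_t tics_by_fixed_wait (uint32_t elapsed_ms)
{
    return elapsed_ms / (1000 / TICRATE);
}

// Deriving the tic count from the millisecond counter itself: the
// rounding error never accumulates beyond a fraction of one tic.
static uint32_t tics_by_scaling (uint32_t elapsed_ms)
{
    return (uint32_t) ((uint64_t) elapsed_ms * TICRATE / 1000);
}

int main (void)
{
    const uint32_t minute = 60 * 1000;      // one minute of wall-clock time

    printf ("fixed 28 ms waits: %u tics\n", tics_by_fixed_wait (minute)); // 2142
    printf ("scaled counter   : %u tics\n", tics_by_scaling (minute));    // 2100 (= 60 * 35)
    return 0;
}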


#define VBLCOUNTER              34000           // hardware tics to a frame
#define TIMERINT 8
void (__interrupt __far *oldtimerisr) ();

/*
================
=
= IO_TimerISR
=
================
*/

//void __interrupt IO_TimerISR (void)

void __interrupt __far IO_TimerISR (void)
{
	ticcount++;
	_outbyte(0x20,0x20);                            // Ack the interrupt
}

/*
=====================
=
= IO_SetTimer0
=
= Sets system timer 0 to the specified speed
=
=====================
*/

void IO_SetTimer0(int speed)
{
	if (speed > 0 && speed < 150)
		I_Error ("INT_SetTimer0: %i is a bad value",speed);

	_outbyte(0x43,0x36);                            // Change timer 0
	_outbyte(0x40,speed);
	_outbyte(0x40,speed >> 8);
}



/*
===============
=
= IO_StartupTimer
=
===============
*/

void IO_StartupTimer (void)
{
	oldtimerisr = _dos_getvect(TIMERINT);

	_dos_setvect (0x8000 | TIMERINT, IO_TimerISR);
	IO_SetTimer0 (VBLCOUNTER);
}

void IO_ShutdownTimer (void)
{
	if (oldtimerisr)
	{
		IO_SetTimer0 (0);              // back to 18.4 ips
		_dos_setvect (TIMERINT, oldtimerisr);
	}
}


Cool info, Quasar! So since the PIT runs at 1.193182 MHz, dividing by 34000 gives about 35.094 tics/sec. It would be possible, in theory, to get closer to 35 tics than that by using 34091 as the frequency divider (effective ticrate: 34.99991...), but I'm not sure how the VGA timing is derived; perhaps they wanted to avoid a conflict with that.
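
For anyone who wants to check those numbers, the relation is just the PIT base clock divided by the programmed divisor:

#include <stdio.h>

int main (void)
{
    const double pit_hz = 1193182.0;        // PIT base clock

    printf ("divisor 34000      : %.5f tics/sec\n", pit_hz / 34000.0);  // ~35.094 (VBLCOUNTER)
    printf ("divisor 34091      : %.5f tics/sec\n", pit_hz / 34091.0);  // ~34.99991
    printf ("divisor 0 (= 65536): %.3f tics/sec\n",  pit_hz / 65536.0); // ~18.2, the stock DOS rate
    return 0;
}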

It would be interesting to make a list of how the various source ports derive their timing, and how accurate their "tics" are, though. This may have implications for both engine and player performance: having an extra 5-7% of real-world reaction time per tic when recording e.g. a COMPET-N class demo is not negligible O_o


I guess I should have said here that the timing issue I raised was valid for ZDoom, since Enderkevin13's questions were about ZDoom scripting. Other ports and vanilla use different timing methods.

I wouldn't be surprised if henceforth, ZDoom's "short tics" get systematically raised as Yet Another Reason ZDoom Is Wrong About Everything™ in Doomworld threads. :p

