Everything posted by saberhawk

  1. A new asset plugin is somewhat required. Let's dig into the engine details a bit before we call things giant unwieldy mega systems with no merit, shall we?

So, skinned mesh rendering? The engine originally shipped with CPU-side rigid (i.e. one bone) skinning. It was stuck there for a while because the contemporary exporter only supported rigid skinning, which doesn't work great for actual skin. The BFME version of the exporter added support for two-bone soft skinning by duplicating the rigid skinning system and storing an extra set of vertices optimized for rigid skinning, along with the extra bone IDs and weights required. This is incredibly memory-bandwidth- and space-intensive and doesn't scale well to soft skinning with more than two bones. It does look better, though, and the engine was updated to read and use this "new" type of skinning data.

You are. The purpose of WWskin was to make content in the content authoring tool behave like content in the engine does. Things like inverse kinematics (i.e. feet following terrain) require engine-level support when not faked via baked hierarchy transform animations, which is what mocap data generally is (until you get into facecap) and which the engine already supports.

k.

PS: Whatever you do, don't log how long MeshModelClass::get_deformed_vertices(Vector3* dst_vert, Vector3* dst_norm, uint32 stride, const HTreeClass* htree) takes, as it hits the slow weighted skinning case many times each frame in a multiplayer match. (A rough skinning sketch follows this list.)

Getting rid of confusing features like the W3D fixed-function emulation übershader generator would really help. An audit of a large library of .w3d files found no per-vertex specular lighting, so I didn't bother creating test cases and implementing accurate emulation for them; that kept the übershader from exploding even more than it already had.

I wish I knew what all the various material settings really did. The options in the exporter GUI don't always match up with the data contained in exported files. Old exported files need to keep working; they don't just disappear. Exported files contain lots of features not exposed by the exporter, such as alternative materials.

The rendering system's runtime data is very different from the stored data. For example, all texture coordinates are stored upside down and need to be flipped (see the small UV-flip sketch after this list). Why? Because Renegade originally used SurRender 3D, which followed OpenGL math conventions instead of Direct3D ones. These quirks in the file format can't really be fixed without updating all the related tools, like the level editor, which we couldn't really update because its GUI used MFC6, now two decades obsolete and incompatible with newer compilers. The engine also performs plenty of fixups for "oh noes, we exported the file but the settings are wrong and we can't export a new version" and fog.

After all of that, the post-processed mesh finally reaches the step of generating #defines for the shaders it needs to compile to render triangles with that particular combination of settings (a stripped-down #define illustration follows this list). This happens with the source code found here: https://gist.github.com/saberhawk/199235c38a2b1051f06f0340cec9b5b4#file-ff_generator-cpp-L22. The rest of the gist contains the übershader code. There are some optimizations to prevent compiling a particular shader twice and to load compiled shaders instead of compiling them as required, but there are still a ridiculous number of combinations.
And then the actual D3D11 rendering engine just uses whichever shader (i.e. compiled .fx code, not the "shader" in the W3D exporter) and a combination of constant buffers and textures, some static and some dynamic. That is so much more powerful than the limited options from the exporter, and it's where those options eventually end up anyway so they can work. Is it really that much of a dream for tools to just use that directly instead of maintaining the fixed-function emulation übershader generator?
  2. A fair question; we don't know yet. The game should have created crashdump files and stored them at %USERPROFILE%\documents\W3D Hub\games\apb-release\debug. If you compress (zip/rar/7z/bz2/whatever) the most recent crashdump files and upload them here, we can take a look.
  3. That's good, because it's still quite new.
  4. What engine limitations? If we managed to overcome the primary challenges, we'd already be well equipped to handle any platform differences. Consoles do work differently though; something like the launcher wouldn't pass technical requirements.
  5. No, because it's packed into always.dat. Why would any other file packed into always.dat be fine?
  6. The current launcher is unlikely to ever support Linux as it's a WPF application. http://www.mono-project.com/docs/gui/wpf/
  7. Changing settings in engine.cfg or through the launcher? Both, apparently ("I tried what saberhawk suggested and it didn't work"). engine.cfg changes definitely affect the game. Is it possible that you are using an older version of Bandicam that doesn't support capturing D3D11 games?
  8. Something you could try is changing the fullscreen mode to 2 in Documents\W3D Hub\games\apb-release\engine.cfg (a hypothetical example follows this list).
  9. Or you can just go in-game to Options->Configuration->Performance, select Expert Mode, and change the Texture Filter?
  10. It is the latter.

"While it would be a neat feature, NVIDIA and Oculus have both released SDKs in order to help developers create new games for the HTC Vive and Oculus Rift. From what I've seen, the Oculus Rift also takes advantage of motion controls as well. That would mean an entirely new animation system, as the current one is so rigid it requires button presses to change animations. While I don't doubt the skill of the guys at Tiberian Technologies, I seriously doubt a complete, VR-ready animation system is in the works or even in the scope of their scripting knowledge. And if it was to favor the HTC Vive as opposed to the OR, you would want to make sure that the motion controllers could give feedback if used. Shit, I'd like to see a limited, adjustable aiming deadzone put in place (not that I would use it, but it's a nice feature). Long story short, while definitely possible, it would take too much development time. Without some sort of grant from EA to start working on it, there would be no reason to do it."

What most people think about modding and "scripting" with this engine is wrong, especially when it comes to recent iterations ("scripts 5.0"). Yeah, there's probably a feature where a button press will play an animation (on what?) in a networked game. Some people use it. It's far from the actual animation system capabilities; it's an old "script" using a hacky interface put in place about a decade ago. Before we, you know, added a decade's worth of code and all the man-years of previous thought in the designs we cloned. Like the ones required for defining the concept of a "script" in the first place, the input system to detect that button press, the netcode to send it to the server and synchronize clients, and the animation system to actually play it by sampling animation curves and feeding the also cloned/rewritten/rewritten again/and again rendering engine. All of these with many of our original hacks in place for backwards compatibility with ourselves and some other closed-source projects, and shared between a huge number of mods.

Things are "laggy" because too much gameplay is written literally in scripts.dll in "script" form, which is all server-side scripting. Things are "ugly" because the tools suck and can't feed the engine better data, and I'm working as fast as I can to make better ones. I wish people would/could help, but they usually don't, so I prioritize whatever I feel like doing. Taking a break from thinking about VR systems (which is part of my day job) and working on a simple 3D engine is supposed to be relaxing.

Always, but it's going to be much broader than what is traditionally thought of as gaming, adding in other forms of entertainment and communication. It's also incredibly useful when it comes to jobs that deal with designing physical objects. Designs for "things" are growing increasingly virtual, and the ability to visualize and manipulate those designs directly and easily is incredible.
  11. It's not much work to remove the line or so of code that updates the camera direction. Using head tracking is slightly more complicated because nobody seems to agree on directions: is +Z up, into the screen, or out of the screen?

What's actually complicated is getting the engine to render twice as much content to a 2560x1200 screen at 90 FPS. Failure to maintain frame rate can make people physically sick. Sudden camera movements need to be avoided, otherwise people get physically sick. Each eye uses a slightly different perspective (a minimal per-eye offset sketch follows this list), so any tricks that assume a single perspective need to be fixed, otherwise they will look weird and could also make people sick. A few things that immediately come to mind are certain particle systems, "billboards" (camera-aligned objects), and the sky actually being a tiny sphere that's shrinkwrapped to the camera and nowhere close to where it should be physically.

After people can finally stay inside the virtuality for more than a few minutes, then comes the matter of controls and redesigning the HUD. The old trick of "put it in the corners" doesn't work anymore, as those are almost entirely outside your field of view and very difficult to see. Depth contrast also needs to be introduced because it's a lot more effective than color contrast. The trickiest thing will likely be compelling and fair gameplay for VR players vs non-VR players. A 24-player VR-only match would be cool, but realistically that's years away.

This only holds true if you assume the desired return is something like X new players or Y more sales because of that VR mode. It's very likely that there are only a few players with VR-capable computers, and APB is a free project. But it's run by volunteers, some of whom might see the experience of adding support for emerging VR technology to an existing game engine as rather valuable, even if only a handful of other people will see it at first.

Kinda. SteamVR has something called "Desktop Theater" which simulates a large screen for you to play games on; potentially IMAX-sized, pretty epic. That's probably the closest you can get to running any existing game in VR, and it's likely that it'll support a fake VR-like mode for games already supporting stereoscopic 3D rendering. But you can only count on it for the HTC Vive right now since it's still very early in the VR wars; it's unclear if Oculus will support SteamVR.
  12. A mission in the SP campaign was Havoc infiltrating a Nod ship.
  13. If anybody tries that, I'll change the code reading/using stylemgr.ini to break compatibility. Because that's the better place to change it in the first place.
  14. This is news to me since I specifically avoided using simultaneous multiple render targets due to lack of hardware adoption. Which Intel GPU are you using?
  15. Thanks for all the code you've contributed over the years. I thought you would have noticed by now that the 5.0 codebase uses features like non-static data member initializers, variadic templates, static_assert, delegating constructors, and deleted functions; all of which are new to VS2013, still work with the XP toolchain, and are rather useful (a toy example of these features follows this list). Something else used that's incredibly useful is D3D9Ex; it simplifies video resource handling significantly by not requiring you to keep track of all resources and recreate them from scratch on alt-tab. Unfortunately that requires an operating system written in the past decade, go figure.
  16. This. I've often seen performance improvements just from cloning old code out of the optimized Renegade binaries.
  17. Unfortunately GDI command insists on using outdated fire-control systems from 2002 even though it's 2030 and technology has advanced since then. We are trying to convince them otherwise, but these sort of changes take time.
  18. Not in time for this upcoming release.
  19. I consider FPS issues to be things like performing significantly worse than other games that look much better on the same hardware. Reborn has a ton of them. As for core system specs, I have a Core i7 3930k, 2x GTX 670, 16GB of RAM, and a 256GB Vertex 4. Those easily come up to $2500 worth of hardware, and that's not counting things like the case, monitors (3x 24" Dell U2410), or other internal hardware (like the other 6 drives, RAID controller, and sound card).
  20. I work on engine efficiency improvements because I do have a "$5,000" "gaming" rig and get poor FPS.
  21. What I'm saying is to discuss the original ideas with the programming team instead of ones that were already shoehorned into perceived engine limitations. They will invariably know more about current engine capabilities and how to expand them or work around them appropriately.
  22. This is exactly what I'm talking about. Rather than actually stating the problem ("As a player using binoculars, I would like to mark targets for my team members"), everybody jumps right into various solutions that draw upon previous limitations that haven't been around for a while. For this particular problem, we have significant (if not full) control over the radar code, the stealth code, the weapon code, the "defense object" code (which provides health/armor), and much more. It's always better if we don't abuse existing gameplay systems. When we've done it before, the features turned out very hacky, buggy, laggy, and overall not a good experience.
  23. "well you can't attach scripts to warheads, that's one drawback of the engine" The biggest drawback to this engine is people believing they understand the drawbacks of this engine and designing gameplay around those perceived drawbacks, which probably aren't really drawbacks or limitations at all.
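
For the slow weighted skinning path mentioned in post 1: a minimal sketch of what two-bone CPU soft skinning generally looks like. The struct layout and function names here are illustrative, not the actual W3D code.

    #include <cstddef>
    #include <cstdint>

    struct Vector3 { float x, y, z; };

    // Per-vertex skinning data: two bone influences per vertex stored
    // alongside the bind-pose position (illustrative layout; real W3D
    // data keeps a duplicated rigid-optimized vertex set as well).
    struct SkinVertex {
        Vector3       position;   // bind-pose position
        std::uint16_t bone[2];    // indices into the pose matrix palette
        float         weight[2];  // blend weights, weight[0] + weight[1] == 1
    };

    struct Matrix3x4 {
        float m[3][4];
        Vector3 transform(const Vector3& v) const {
            return { m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3],
                     m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3],
                     m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3] };
        }
    };

    // The weighted path: two matrix transforms and a lerp per vertex.
    // Doing this on the CPU for every skinned mesh, every frame, is why
    // logging it during a multiplayer match is so depressing.
    void skin_vertices(const SkinVertex* src, Vector3* dst, std::size_t count,
                       const Matrix3x4* pose /* one matrix per bone */) {
        for (std::size_t i = 0; i < count; ++i) {
            const SkinVertex& v = src[i];
            const Vector3 p0 = pose[v.bone[0]].transform(v.position);
            const Vector3 p1 = pose[v.bone[1]].transform(v.position);
            const float w0 = v.weight[0], w1 = v.weight[1];
            dst[i] = { p0.x*w0 + p1.x*w1, p0.y*w0 + p1.y*w1, p0.z*w0 + p1.z*w1 };
        }
    }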
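For the upside-down texture coordinates in post 1: a minimal sketch of the kind of fixup performed when converting stored data to runtime data, assuming a plain float2 UV layout (names illustrative).

    #include <cstddef>

    struct UV { float u, v; };

    // OpenGL puts v=0 at the bottom of a texture, Direct3D at the top,
    // so coordinates stored under OpenGL-era conventions (SurRender 3D)
    // need flipping before a D3D pipeline samples with them.
    void flip_texcoords(UV* uvs, std::size_t count) {
        for (std::size_t i = 0; i < count; ++i)
            uvs[i].v = 1.0f - uvs[i].v;
    }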
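For the #define generation step in post 1: the real generator is in the linked gist; this is a stripped-down illustration with made-up flag names, just to show the shape of the problem.

    #include <string>
    #include <vector>

    // Hypothetical subset of the post-processed material state; the real
    // flags live in ff_generator.cpp in the gist.
    struct MaterialState {
        bool alpha_test     = false;
        bool vertex_color   = false;
        int  texture_stages = 1;
    };

    // One combination of settings becomes one list of preprocessor
    // defines, i.e. one ubershader variant to compile. Every boolean
    // flag doubles the number of possible variants, which is how you
    // end up with a ridiculous number of combinations.
    std::vector<std::string> build_defines(const MaterialState& s) {
        std::vector<std::string> defines;
        if (s.alpha_test)   defines.push_back("ALPHA_TEST=1");
        if (s.vertex_color) defines.push_back("VERTEX_COLOR=1");
        defines.push_back("TEXTURE_STAGES=" + std::to_string(s.texture_stages));
        return defines;
    }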
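For the engine.cfg suggestion in post 8: the change might look like the line below. The key name is a guess for illustration, not confirmed against the actual file.

    ; Documents\W3D Hub\games\apb-release\engine.cfg
    ; "FullscreenMode" is a hypothetical key name; check the real file.
    FullscreenMode=2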
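For the per-eye perspective point in post 11: a minimal sketch of offsetting the camera per eye, with invented names; not engine code.

    struct Vector3 { float x, y, z; };

    // Each eye renders the full scene from a camera shifted half the
    // interpupillary distance (IPD) along the head's right axis. Two
    // full scene passes per frame at 90 FPS is the "twice as much
    // content" problem from the post.
    Vector3 eye_position(const Vector3& head, const Vector3& right,
                         float ipd_meters, int eye /* 0 = left, 1 = right */) {
        const float half = (eye == 0 ? -0.5f : 0.5f) * ipd_meters;
        return { head.x + right.x * half,
                 head.y + right.y * half,
                 head.z + right.z * half };
    }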
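For the language features listed in post 15: a toy example using each one, unrelated to the actual codebase.

    #include <cstdio>

    // Variadic template: sums an arbitrary argument pack.
    template <typename T>
    T sum(T v) { return v; }
    template <typename T, typename... Rest>
    T sum(T first, Rest... rest) { return first + sum(rest...); }

    struct Resource {
        int   id    = -1;     // non-static data member initializer
        float scale = 1.0f;   // ditto

        Resource() : Resource(0) {}           // delegating constructor
        explicit Resource(int id_) : id(id_) {}

        Resource(const Resource&) = delete;   // deleted functions: no copies
        Resource& operator=(const Resource&) = delete;
    };

    static_assert(sizeof(int) >= 4, "assumes at least a 32-bit int");

    int main() {
        Resource r(42);
        std::printf("%d %d\n", r.id, sum(1, 2, 3, 4)); // prints "42 10"
        return 0;
    }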