Cache to speed up loading multiple instances of the same scene URL

The new TCastleSceneCore.Cache property allows loading the scene contents through a shared cache. If you have multiple TCastleScene instances loaded from the same URL, this is a sure way to speed up their loading.
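A minimal sketch of using it, assuming the URL castle-data:/building.gltf, the count and the layout are placeholders, and that Cache should be set before Load:

```pascal
uses CastleScene, CastleViewport, CastleVectors;

procedure LoadBuildings(const Viewport: TCastleViewport);
var
  Scene: TCastleScene;
  I: Integer;
begin
  for I := 0 to 9 do
  begin
    Scene := TCastleScene.Create(Viewport);
    Scene.Cache := true; // set before Load, so contents go through the shared cache
    Scene.Load('castle-data:/building.gltf');
    Scene.Translation := Vector3(I * 10, 0, 0);
    Viewport.Items.Add(Scene);
  end;
end;
```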

Testcase: examples/viewport_and_scenes/occlusion_culling, where I set up multiple buildings and creatures from the same glTF.

Alternative: You can achieve even more optimization (loading time, memory use) using TCastleTransformReference or, equivalently under the hood, just adding the same TCastleScene instance multiple times to the viewport. These techniques let you really have one instance and just draw it multiple times. This is a much more powerful optimization, but it also puts more constraints: all the referenced instances must play the same animation, occlusion culling doesn't work on them, and so on. A sketch of this approach follows.
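For comparison, a sketch of the reference-based approach; the URL and the placement are again just placeholders:

```pascal
uses CastleScene, CastleViewport, CastleTransform, CastleVectors;

procedure AddReferencedBuildings(const Viewport: TCastleViewport);
var
  Template: TCastleScene;
  Ref: TCastleTransformReference;
  I: Integer;
begin
  { One actual scene instance... }
  Template := TCastleScene.Create(Viewport);
  Template.Load('castle-data:/building.gltf');
  Viewport.Items.Add(Template);

  { ...rendered multiple times at different transformations. }
  for I := 1 to 9 do
  begin
    Ref := TCastleTransformReference.Create(Viewport);
    Ref.Reference := Template; // all references draw the same single instance
    Ref.Translation := Vector3(I * 10, 0, 0);
    Viewport.Items.Add(Ref);
  end;
end;
```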

In contrast, using TCastleSceneCore.Cache is much safer: it doesn't change how things work at all from the developer's perspective. Each scene may be in a different state, play a different animation and so on; we just load them faster internally. It is also simpler to use: just toggle the Cache checkbox and observe the speed boost.

Related: Do you want to measure the loading speed, e.g. to compare what you gain from the cache? See the advice about “profiling” in our manual. To cut a long story short, I advise setting Profiler.Enabled := true in your initialization, and then observing the report in the log about what takes time at initialization. For more about TCastleProfiler, see the API docs.
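A minimal sketch of enabling this; Profiler comes from the CastleTimeUtils unit and InitializeLog from CastleLog:

```pascal
uses CastleTimeUtils, CastleLog;

begin
  InitializeLog;            // the profiler report is written to the log
  Profiler.Enabled := true; // gather and report timing of loading
  { ... the usual application initialization, scene loading etc. ... }
end.
```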

How this cache works now, and what the future holds: Right now, it works by internally storing the nodes graph in the cache; instead of loading a file again, we can just call TX3DNode.DeepCopy to clone the graph. This is nice, but in the future it could be even better: if we change how our animations work so that they do not modify the graph, then all cached scenes could share one and the same graph. This is a possible future optimization.
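For illustration only, a hypothetical, simplified cache built on this idea. This is not the actual engine code; TSimpleNodesCache and its methods are invented for this sketch, while LoadNode (from X3DLoad) and TX3DNode.DeepCopy are the real engine calls:

```pascal
uses Generics.Collections, X3DNodes, X3DLoad;

type
  TSimpleNodesCache = class
  private
    FMap: TDictionary<String, TX3DRootNode>;
  public
    constructor Create;
    destructor Destroy; override;
    function GetNode(const Url: String): TX3DRootNode;
  end;

constructor TSimpleNodesCache.Create;
begin
  inherited;
  FMap := TDictionary<String, TX3DRootNode>.Create;
end;

destructor TSimpleNodesCache.Destroy;
var
  Node: TX3DRootNode;
begin
  for Node in FMap.Values do
    Node.Free;
  FMap.Free;
  inherited;
end;

function TSimpleNodesCache.GetNode(const Url: String): TX3DRootNode;
var
  Original: TX3DRootNode;
begin
  { Load from file only on the first request for this URL. }
  if not FMap.TryGetValue(Url, Original) then
  begin
    Original := LoadNode(Url);
    FMap.Add(Url, Original);
  end;
  { Each caller gets an independent clone, so each scene can
    animate and change state independently of the others. }
  Result := Original.DeepCopy as TX3DRootNode;
end;
```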

If you like this, please support us on Patreon!
