Compare commits


2 Commits

Author SHA1 Message Date
kaosat.dev 9d30d18416 feat(Blenvy:Blender): fixed a number of remaining issues with project serialization & attempted to fix scene rename detection
* moved out collections serialization from scenes loop
 * fixed issue with materials hashing
 * fixed issue with custom properties hashing
 * fixed issue with scene properties hashing
 * minor related tweaks
 * still pulling my hair out over weirdness with scene rename detection and handling
2024-07-15 01:46:27 +02:00
kaosat.dev 1059858363 feat(Blenvy:Blender): overhauled & upgraded project serialization & diffing
* now also outputting separate collections hash & materials hash from serialize_project
 * changed project_diff to do diffing of materials & collections
 * hooked up output data to export logic
 * related tweaks & improvements
2024-07-11 14:41:15 +02:00
10 changed files with 219 additions and 89 deletions

View File

@ -153,7 +153,16 @@ Blender side:
- [x] filter out xxx_ui propgroups
- [x] fix missing main/lib scene names in blenvy_common_settings
- [x] fix incorrect updating of main/lib scenes list in settings
- [ ] add handling of scene renames
- [x] store (on load) a mapping of scene objects to scene names
- [x] on save, calculate another mapping of scene objects to scene names
- if there is a mismatch between the stored version & the new version for a given scene, it has been renamed !
- [x] pass this information to scene diffing to remap old/new scene names
- [ ] move the rename detection to AFTER scene serialization, otherwise we could have a naming mismatch
- weird behaviour, perhaps find another way, i.e. replace the scene name in the saved previous data
- is post save causing the issue ? review
- [ ] investigate weird issue of changes detected to all after a reload
- [x] should we write the previous _xxx data only AFTER a successful export ?
- [x] finer grained control of setting changes to trigger a re-export:
- [x] common: any of them should trigger
@ -187,9 +196,11 @@ Blender side:
- [x] fix selection logic
- [x] update testing blend files
- [x] disable 'export_hierarchy_full_collections' for all cases: not reliable and redundant
- [x] fix systematic material exports despite no changes
- [x] investigate lack of detection of changes of adding/changing components
- [x] change scene serialization to account for collections ...sigh
- [x] also add one NOT PER scene for materials, to fix the above issue with materials
- [x] move material caching into hash material
- [ ] also remove ____dummy____.bin when export format is gltf
- [ ] fix/cleanup asset information injection (also needed for hot reload)
@ -224,7 +235,6 @@ Blender side:
- [ ] inject_export_path_into_internal_blueprints should be called on every asset/blueprint scan !! Not just on export
- [ ] undo after a save removes any saved "serialized scene" data ? DIG into this
- [ ] handle scene renames between saves (breaks diffing) => very hard to achieve
- [ ] add tests for
- [ ] disabled components
- [ ] blueprint instances as children of blueprint instances
@ -269,11 +279,13 @@ Bevy Side:
- [x] account for changes impacting both parent & children (ie "world" and "blueprint3" for example), which leads to a crash as there is a double despawn/respawn, so we need to filter things out
- [x] if there are many assets/blueprints that have changed at the same time, it causes issues similar to the above, so apply a similar fix
- [x] also ignore any entities currently spawning (better to lose some information than cause a crash)
- [ ] something is off with blueprint level components
- [ ] add the root blueprint itself to the assets either on the blender side or on the bevy side programatically
- [x] for sub blueprint tracking: do not propagate/deal with parent blueprints if they are not themselves Spawning (ie filter out by "BlueprintSpawning")
- [ ] invalidate despawned entity & parent entities AABB
- [x] cleanup internals
- [ ] analyse what is off with blueprint level components
- [ ] add the root blueprint itself to the assets either on the blender side or on the bevy side programatically
- [ ] invalidate despawned entity & parent entities AABB
- [ ] add unloading/cache removal of materials
- [x] review & change general component insertion & spawning ordering & logic
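The scene-rename detection sketched in the checklist above (store an object→name mapping on load, recompute it on save, and treat mismatches as renames) boils down to comparing two mappings. A minimal standalone sketch of that idea — function and variable names here are illustrative, not Blenvy's actual API:

```python
def detect_renames(stored_names, current_names):
    """Return {new_name: old_name} for every entry whose name changed."""
    renames = {}
    for key, old_name in stored_names.items():
        new_name = current_names.get(key)
        # the same underlying scene object now exists under a different name -> renamed
        if new_name is not None and new_name != old_name:
            renames[new_name] = old_name
    return renames

# keys stand in for stable scene-object identities, values are scene names
stored = {"scene_ptr_1": "World", "scene_ptr_2": "Library"}
current = {"scene_ptr_1": "MainLevel", "scene_ptr_2": "Library"}
print(detect_renames(stored, current))  # {'MainLevel': 'World'}
```

The resulting new-name→old-name map is exactly the shape the diffing step needs to look a renamed scene up under its previous name.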

View File

@ -152,6 +152,28 @@ def register():
    bpy.app.handlers.depsgraph_update_post.append(post_update)
    bpy.app.handlers.save_post.append(post_save)

    """handle = object()
    subscribe_to = bpy.types.Scene, "name"

    def notify_test(context):
        #if (context.scene.type == 'MESH'):
        print("Renamed", dir(context), context.scenes)

    bpy.msgbus.subscribe_rna(
        key=subscribe_to,
        owner=bpy,
        args=(bpy.context,),
        notify=notify_test,
    )"""
    #bpy.msgbus.publish_rna(key=subscribe_to)

def unregister():
    for cls in classes:
        bpy.utils.unregister_class(cls)

View File

@ -6,7 +6,7 @@ def is_blueprint_always_export(blueprint):
    return blueprint.collection['always_export'] if 'always_export' in blueprint.collection else False

# this also takes the split/embed mode into account: if a nested collection changes AND embed is active, its container collection should also be exported
def get_blueprints_to_export(changes_per_scene, changes_per_collection, changed_export_parameters, blueprints_data, settings):
    export_gltf_extension = getattr(settings, "export_gltf_extension", ".glb")
    blueprints_path_full = getattr(settings, "blueprints_path_full", "")
    change_detection = getattr(settings.auto_export, "change_detection")
@ -37,8 +37,9 @@ def get_blueprints_to_export(changes_per_scene, changed_export_parameters, bluep
    # also deal with blueprints that are always marked as "always_export"
    blueprints_always_export = [blueprint for blueprint in internal_blueprints if is_blueprint_always_export(blueprint)]
    changed_blueprints_based_on_changed_collections = [blueprint for blueprint in internal_blueprints if blueprint.collection in changes_per_collection.values()]

    blueprints_to_export = list(set(changed_blueprints + blueprints_not_on_disk + blueprints_always_export + changed_blueprints_based_on_changed_collections))

    # filter out blueprints that are not marked & deal with the different combine modes

View File

@ -17,7 +17,7 @@ from ..levels.bevy_scene_components import remove_scene_components, upsert_scene
"""this is the main 'central' function for all auto export"""
def auto_export(changes_per_scene, changes_per_collection, changes_per_material, changed_export_parameters, settings):
    # have the export parameters (not auto export, just gltf export) changed: if yes (for example switch from glb to gltf, compression or not, animations or not etc), we need to re-export everything
    print("changed_export_parameters", changed_export_parameters)
    try:
@ -63,15 +63,15 @@ def auto_export(changes_per_scene, changed_export_parameters, settings):
        if do_export_blueprints:
            print("EXPORTING")
            # get blueprints/collections infos
            (blueprints_to_export) = get_blueprints_to_export(changes_per_scene, changes_per_collection, changed_export_parameters, blueprints_data, settings)

            # get level/main scenes infos
            (main_scenes_to_export) = get_levels_to_export(changes_per_scene, changes_per_collection, changed_export_parameters, blueprints_data, settings)

            # since materials export adds components we need to call this before blueprints are exported
            # export materials & inject materials components into relevant objects
            # FIXME: improve change detection, perhaps even add "material changes"
            if export_materials_library and (changed_export_parameters or len(changes_per_material.keys()) > 0):
                export_materials(blueprints_data.blueprint_names, settings.library_scenes, settings)

            # update the list of tracked exports

View File

@ -13,20 +13,25 @@ def prepare_and_export():
    auto_export_settings = blenvy.auto_export
    if auto_export_settings.auto_export: # only do the actual exporting if auto export is actually enabled
        # determine changed objects
        per_scene_changes, per_collection_changes, per_material_changes, project_hash = get_changes_per_scene(settings=blenvy)
        # determine changed parameters
        setting_changes, current_common_settings, current_export_settings, current_gltf_settings = get_setting_changes()
        print("changes: settings:", setting_changes)
        print("changes: scenes:", per_scene_changes)
        print("changes: collections:", per_collection_changes)
        print("changes: materials:", per_material_changes)
        print("project_hash", project_hash)
        # do the actual export
        # blenvy.auto_export.dry_run = 'NO_EXPORT' #'DISABLED'#
        auto_export(per_scene_changes, per_collection_changes, per_material_changes, setting_changes, blenvy)

        # -------------------------------------
        # now that this point is reached, the export should have run correctly, so we can save all the current state to the "previous" one
        for scene in bpy.data.scenes:
            blenvy.scenes_to_scene_names[scene] = scene.name
        print("bla", blenvy.scenes_to_scene_names, "hash", project_hash)

        # save the current project hash as previous
        upsert_settings(".blenvy.project_serialized_previous", project_hash, overwrite=True)
        # write the new settings to the old settings

View File

@ -1,6 +1,7 @@
import json
import traceback
import bpy
from .serialize_project import serialize_project
from blenvy.settings import load_settings, upsert_settings

def bubble_up_changes(object, changes_per_scene):
@ -18,6 +19,7 @@ def serialize_current(settings):
    print("GENERATE ID")
    scene.id_test = str(uuid.uuid4())
    print("SCENE ID", scene.id_test)
    # https://blender.stackexchange.com/questions/216411/whats-the-replacement-for-id-or-hash-on-bpy-objects

    current_scene = bpy.context.window.scene
    bpy.context.window.scene = bpy.data.scenes[0]
@ -26,7 +28,7 @@ def serialize_current(settings):
    """with bpy.context.temp_override(scene=bpy.data.scenes[1]):
        bpy.context.scene.frame_set(0)"""

    current = serialize_project(settings)
    bpy.context.window.scene = current_scene
# reset previous frames # reset previous frames
@ -39,62 +41,125 @@ def get_changes_per_scene(settings):
    previous = load_settings(".blenvy.project_serialized_previous")
    current = serialize_current(settings)

    # in Blender there is no uuid per object, hashes change on undo/redo, and the address/pointer of an object may change at any undo/redo without any way of knowing when
    # so... ugh
    scenes_to_scene_names = {}
    for scene in bpy.data.scenes:
        scenes_to_scene_names[scene] = scene.name
    print("cur scenes_to_scene_names", scenes_to_scene_names)
    print("pre fpp", settings.scenes_to_scene_names)

    scene_renames = {}
    for scene in settings.scenes_to_scene_names:
        if scene in scenes_to_scene_names:
            previous_name_of_scene = settings.scenes_to_scene_names[scene]
            current_name_of_scene = scenes_to_scene_names[scene]
            if previous_name_of_scene != current_name_of_scene:
                scene_renames[current_name_of_scene] = previous_name_of_scene
                print("SCENE RENAMED ! previous", previous_name_of_scene, "current", current_name_of_scene)
    print("scene new name to old name", scene_renames)
    # determine changes
    changes_per_scene = {}
    changes_per_collection = {}
    changes_per_material = {}
    try:
        (changes_per_scene, changes_per_collection, changes_per_material) = project_diff(previous, current, scene_renames, settings)
    except Exception as error:
        print(traceback.format_exc())
        print("failed to compare current serialized scenes to previous ones: Error:", error)

    return changes_per_scene, changes_per_collection, changes_per_material, current
def project_diff(previous, current, scene_renames, settings):
    """print("previous", previous)
    print("current", current)"""
    if previous is None or current is None:
        return {}
    changes_per_scene = {}
    changes_per_collection = {}
    changes_per_material = {}

    # possible ? on each save, inject an id into each scene, that cannot be copied over
    current_scenes = current["scenes"]
    previous_scenes = previous["scenes"]
    print("previous scenes", previous_scenes.keys())
    print("current scenes", current_scenes.keys())
    print("new names to old names", scene_renames)
    print("")
    for scene_name in current_scenes:
        print("scene name", scene_name, scene_name in scene_renames)
        current_scene = current_scenes[scene_name]
        current_object_names = list(current_scene.keys())

        # account for renames: look the scene up under its previous name if it was renamed
        updated_scene_name = scene_name if not scene_name in scene_renames else scene_renames[scene_name]
        if updated_scene_name in previous_scenes: # we can only compare scenes that are in both previous and current data
            previous_scene = previous_scenes[updated_scene_name]
            previous_object_names = list(previous_scene.keys())
            added = list(set(current_object_names) - set(previous_object_names))
            removed = list(set(previous_object_names) - set(current_object_names))

            for obj in added:
                if not scene_name in changes_per_scene:
                    changes_per_scene[scene_name] = {}
                changes_per_scene[scene_name][obj] = bpy.data.objects[obj] if obj in bpy.data.objects else None

            # TODO: how do we deal with this, as we obviously do not have data for removed objects ?
            for obj in removed:
                if not scene_name in changes_per_scene:
                    changes_per_scene[scene_name] = {}
                changes_per_scene[scene_name][obj] = None

            for object_name in list(current_scene.keys()): # TODO : exclude directly added/removed objects
                if object_name in previous_scene:
                    current_obj = current_scene[object_name]
                    prev_obj = previous_scene[object_name]
                    same = str(current_obj) == str(prev_obj)
                    if not same:
                        if not scene_name in changes_per_scene:
                            changes_per_scene[scene_name] = {}
                        target_object = bpy.data.objects[object_name] if object_name in bpy.data.objects else None
                        changes_per_scene[scene_name][object_name] = target_object
                        # now bubble up for instances & parents
                        bubble_up_changes(target_object, changes_per_scene[scene_name])
        else:
            print(f"scene {scene_name} not present in previous data")
    current_collections = current["collections"]
    previous_collections = previous["collections"]

    for collection_name in current_collections:
        if collection_name in previous_collections:
            current_collection = current_collections[collection_name]
            prev_collection = previous_collections[collection_name]
            same = str(current_collection) == str(prev_collection)
            if not same:
                #if not collection_name in changes_per_collection:
                target_collection = bpy.data.collections[collection_name] if collection_name in bpy.data.collections else None
                changes_per_collection[collection_name] = target_collection

    # process changes to materials
    current_materials = current["materials"]
    previous_materials = previous["materials"]

    for material_name in current_materials:
        if material_name in previous_materials:
            current_material = current_materials[material_name]
            prev_material = previous_materials[material_name]
            same = str(current_material) == str(prev_material)
            if not same:
                #if not material_name in changes_per_material:
                target_material = bpy.data.materials[material_name] if material_name in bpy.data.materials else None
                changes_per_material[material_name] = target_material

    return (changes_per_scene, changes_per_collection, changes_per_material)
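The collection and material passes above both reduce to comparing two name→hash dictionaries, keeping only names present in both snapshots whose hash changed (mirroring the `if name in previous` guard, newly added names are skipped). A standalone sketch of that core comparison, with illustrative names:

```python
def diff_hashes(previous, current):
    """Names present in both snapshots whose hash changed between runs."""
    return {
        name: current[name]
        for name in current
        if name in previous and previous[name] != current[name]
    }

previous_materials = {"Metal": "aaa", "Wood": "bbb"}
current_materials = {"Metal": "aaa", "Wood": "ccc", "Glass": "ddd"}
print(diff_hashes(previous_materials, current_materials))  # {'Wood': 'ccc'}
```

Since the hashes are opaque strings, equality is the only comparison needed; the heavy lifting all happens at serialization time.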

View File

@ -126,7 +126,10 @@ type_lookups = {
    bpy.types.bpy_prop_collection: _lookup_collection,
    bpy.types.MaterialLineArt: _lookup_materialLineArt,
    bpy.types.NodeTree: node_tree,
    bpy.types.CurveProfile: _lookup_generic,
    bpy.types.RaytraceEEVEE: _lookup_generic,
    bpy.types.CurveMapping: _lookup_generic,
    bpy.types.MaterialGPencilStyle: _lookup_generic,
}

def convert_field(raw_value, field_name="", scan_node_tree=True):
@ -233,11 +236,12 @@ def animation_hash(obj):
# TODO: also how about our new "assets" custom properties ? those need to be checked too
def custom_properties_hash(obj):
    custom_properties = {}
    for property_name in obj.keys():
        if property_name not in '_RNA_UI' and property_name != 'components_meta' and property_name != 'user_assets':
            custom_properties[property_name] = obj[property_name] #generic_fields_hasher_evolved(data=obj[property_name], fields_to_ignore=fields_to_ignore_generic)
            """if property_name == "user_assets":
                print("tptp")
                custom_properties[property_name] = generic_fields_hasher_evolved(data=obj[property_name], fields_to_ignore=fields_to_ignore_generic)"""
    return str(h1_hash(str(custom_properties)))
def camera_hash(obj):
@ -273,12 +277,18 @@ def armature_hash(obj):
        print("bone", bone, bone_hash(bone))"""
    return str(fields)
def material_hash(material, cache, settings):
    cached_hash = cache['materials'].get(material.name, None)
    if cached_hash:
        return cached_hash
    else:
        scan_node_tree = settings.auto_export.materials_in_depth_scan
        #print("HASHING MATERIAL", material.name)
        hashed_material = generic_fields_hasher_evolved(material, fields_to_ignore_generic, scan_node_tree=scan_node_tree)
        #print("HASHED MATERIAL", hashed_material)
        hashed_material = str(hashed_material)
        cache['materials'][material.name] = hashed_material
        return hashed_material
# TODO: this is partially taken from export_materials utilities, perhaps we could avoid having to fetch things multiple times ?
def materials_hash(obj, cache, settings):
@ -286,20 +296,12 @@ def materials_hash(obj, cache, settings):
    materials = []
    for material_slot in obj.material_slots:
        material = material_slot.material
        mat = material_hash(material, cache, settings)
        materials.append(mat)

    return str(h1_hash(str(materials)))
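The cache threaded through material_hash above is plain per-run memoization keyed by material name. The same pattern in isolation, with hashlib standing in for the project's h1_hash (so the digests are illustrative, not Blenvy's):

```python
import hashlib

def make_cached_hasher():
    cache = {}
    def hash_item(name, payload):
        # reuse the digest if this name was already hashed during this run
        if name in cache:
            return cache[name]
        digest = hashlib.sha1(str(payload).encode("utf-8")).hexdigest()
        cache[name] = digest
        return digest
    return hash_item

hasher = make_cached_hasher()
first = hasher("Metal", {"roughness": 0.5})
second = hasher("Metal", {"roughness": 0.5})  # served from the cache, not rehashed
assert first == second
```

Because the cache only lives for one serialization pass, keying by name is safe: a material cannot change mid-pass, so stale entries are not a concern.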
def modifier_hash(modifier_data, settings):
    scan_node_tree = settings.auto_export.modifiers_in_depth_scan
    #print("HASHING MODIFIER", modifier_data.name)
@ -307,7 +309,6 @@ def modifier_hash(modifier_data, settings):
    #print("modifier", modifier_data.name, "hashed", hashed_modifier)
    return str(hashed_modifier)

def modifiers_hash(object, settings):
    modifiers = []
    for modifier in object.modifiers:
@ -316,24 +317,22 @@ def modifiers_hash(object, settings):
        #print(" ")
    return str(h1_hash(str(modifiers)))
def serialize_project(settings):
    cache = {"materials": {}}
    print("serializing project")
    per_scene = {}
    for scene in settings.main_scenes + settings.library_scenes: # bpy.data.scenes:
        print("scene", scene.name)
        # ignore temporary scenes
        if scene.name.startswith(TEMPSCENE_PREFIX):
            continue
        per_scene[scene.name] = {}

        custom_properties = custom_properties_hash(scene) if len(scene.keys()) > 0 else None
        # render settings are injected into each scene
        eevee_settings = generic_fields_hasher_evolved(scene.eevee, fields_to_ignore=fields_to_ignore_generic) # TODO: ignore most of the fields
        view_settings = generic_fields_hasher_evolved(scene.view_settings, fields_to_ignore=fields_to_ignore_generic)
@ -344,8 +343,7 @@ def serialize_scene(settings):
        }
        #generic_fields_hasher_evolved(scene.eevee, fields_to_ignore=fields_to_ignore_generic)
        # FIXME: how to deal with this cleanly
        per_scene[scene.name]["____scene_settings"] = str(h1_hash(str(scene_field_hashes)))

        for object in scene.objects:
@ -382,13 +380,37 @@ def serialize_scene(settings):
            object_field_hashes_filtered = {key: object_field_hashes[key] for key in object_field_hashes.keys() if object_field_hashes[key] is not None}
            objectHash = str(h1_hash(str(object_field_hashes_filtered)))
            per_scene[scene.name][object.name] = objectHash

    # also hash collections (important to catch component changes per blueprints/collections)
    per_collection = {}
    # collections_in_scene = [collection for collection in bpy.data.collections if scene.user_of_id(collection)]
    for collection in bpy.data.collections: # collections_in_scene:
        #loc, rot, scale = bpy.context.object.matrix_world.decompose()
        #visibility = collection.visible_get()
        custom_properties = custom_properties_hash(collection) if len(collection.keys()) > 0 else None
        # parent = collection.parent.name if collection.parent else None
        #collections = [collection.name for collection in object.users_collection]

        collection_field_hashes = {
            "name": collection.name,
            # "visibility": visibility,
            "custom_properties": custom_properties,
            #"parent": parent,
            #"collections": collections,
        }
        collection_field_hashes_filtered = {key: collection_field_hashes[key] for key in collection_field_hashes.keys() if collection_field_hashes[key] is not None}
        collectionHash = str(h1_hash(str(collection_field_hashes_filtered)))
        per_collection[collection.name] = collectionHash

    # also hash materials, to avoid constantly exporting materials libraries
    # this should be similar to change detection for scenes
    per_material = {}
    for material in bpy.data.materials:
        per_material[material.name] = str(h1_hash(material_hash(material, cache, settings)))

    return {"scenes": per_scene, "collections": per_collection, "materials": per_material}
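serialize_project now returns a single snapshot holding three name→hash maps. The shape of that snapshot can be sketched with plain dictionaries (hashlib replaces the project's h1_hash here, so the digests are illustrative and the field contents are made up):

```python
import hashlib

def h1(value):
    # stand-in for the project's h1_hash: any stable string digest works here
    return hashlib.sha1(str(value).encode("utf-8")).hexdigest()

def snapshot(scenes, collections, materials):
    """Build the {scenes, collections, materials} name -> hash snapshot shape."""
    return {
        "scenes": {name: {obj: h1(data) for obj, data in objects.items()}
                   for name, objects in scenes.items()},
        "collections": {name: h1(fields) for name, fields in collections.items()},
        "materials": {name: h1(fields) for name, fields in materials.items()},
    }

snap = snapshot(
    scenes={"World": {"Cube": {"location": (0, 0, 0)}}},
    collections={"Blueprint1": {"custom_properties": {"always_export": True}}},
    materials={"Metal": {"roughness": 0.5}},
)
print(sorted(snap))  # ['collections', 'materials', 'scenes']
```

Scenes map two levels deep (scene → object → hash) while collections and materials are flat, which is why project_diff walks them with different loops.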

View File

@ -55,7 +55,7 @@ def should_level_be_exported(scene_name, changed_export_parameters, changes_per_
    )

# this also takes the split/embed mode into account: if a collection instance changes AND embed is active, its container level/world should also be exported
def get_levels_to_export(changes_per_scene, changes_per_collection, changed_export_parameters, blueprints_data, settings):
    # determine list of main scenes to export
    # we have more relaxed rules to determine if the main scenes have changed : any change is ok (allows easier handling of changes, render settings etc)
    main_scenes_to_export = [scene_name for scene_name in settings.main_scenes_names if should_level_be_exported(scene_name, changed_export_parameters, changes_per_scene, blueprints_data, settings)]

View File

@ -33,7 +33,6 @@ def inject_export_path_into_internal_blueprints(internal_blueprints, blueprints_
    for blueprint in internal_blueprints:
        blueprint_exported_path = os.path.join(blueprints_path, f"{blueprint.name}{gltf_extension}")
        # print("injecting blueprint path", blueprint_exported_path, "for", blueprint.name)
        blueprint.collection["export_path"] = blueprint_exported_path
        if export_materials_library:
            blueprint.collection["materials_path"] = materials_exported_path

View File

@ -37,9 +37,9 @@ def is_scene_already_in_use(self, scene):
        return True

class BlenvyManager(PropertyGroup):
    settings_save_path = ".blenvy_common_settings" # where to store data in bpy.texts
    settings_save_enabled: BoolProperty(name="settings save enabled", default=True) # type: ignore
scenes_to_scene_names = {} # used to map scenes to scene names to detect scene renames for diffing
    mode: EnumProperty(
        items=(
@ -187,3 +187,7 @@ class BlenvyManager(PropertyGroup):
        # now load component settings
        self.components.load_settings()
        for scene in bpy.data.scenes:
            self.scenes_to_scene_names[scene] = scene.name