feat(Blenvy:Blender): fixed a number of remaining issues with project serialization & attempted to fix scene rename detection

* moved collections serialization out of the scenes loop
* fixed issue with materials hashing
* fixed issue with custom properties hashing
* fixed issue with scene properties hashing
* minor related tweaks
* still pulling my hair out over weirdness in scene rename detection and handling
kaosat.dev 2024-07-15 01:37:13 +02:00
parent 1059858363
commit 9d30d18416
7 changed files with 148 additions and 92 deletions

View File

@@ -153,7 +153,16 @@ Blender side:
 - [x] filter out xxx_ui propgroups
 - [x] fix missing main/lib scene names in blenvy_common_settings
 - [x] fix incorrect updating of main/lib scenes list in settings
-- [ ] and what about scene renames ?? perhaps trigger a forced "save settings" before doing the export ?
+- [ ] add handling of scene renames
+    - [x] store (on load) a mapping of scene objects to scene names
+    - [x] on save, calculate another mapping of scene objects to scene names
+        - if there is a mismatch between the stored version & the new version for a given scene, it has been renamed!
+    - [x] pass this information to scene diffing to remap old/new scene names
+    - [ ] move the rename detection to AFTER scene serialization, otherwise we could have a naming mismatch
+        - weird behaviour; perhaps find another way, e.g. replace the scene name in the saved previous data
+        - is post_save causing the issue? review
+- [ ] investigate weird issue of changes detected to all after a reload
 - [x] should we write the previous _xxx data only AFTER a successful export ?
 - [x] finer grained control of setting changes to trigger a re-export:
     - [x] common: any of them should trigger
@@ -187,11 +196,11 @@ Blender side:
 - [x] fix selection logic
 - [x] update testing blend files
 - [x] disable 'export_hierarchy_full_collections' for all cases: not reliable and redundant
-- [ ] fix systematic material exports despite no changes
-- [ ] investigate lack of detection of changes when adding/changing components
+- [x] fix systematic material exports despite no changes
+- [x] investigate lack of detection of changes when adding/changing components
 - [x] change scene serialization to account for collections ...sigh
 - [x] also add one NOT PER scene for materials, to fix the above issue with materials
-- [ ] move material caching into hash material
+- [x] move material caching into hash material
 - [ ] also remove ____dummy____.bin when export format is gltf
 - [ ] fix/cleanup asset information injection (also needed for hot reload)
@@ -226,7 +235,6 @@ Blender side:
 - [ ] inject_export_path_into_internal_blueprints should be called on every asset/blueprint scan !! Not just on export
 - [ ] undo after a save removes any saved "serialized scene" data ? DIG into this
-- [ ] handle scene renames between saves (breaks diffing) => very hard to achieve
 - [ ] add tests for
     - [ ] disabled components
     - [ ] blueprint instances as children of blueprint instances
@@ -271,11 +279,13 @@ Bevy Side:
 - [x] account for changes impacting both parent & children (ie "world" and "blueprint3"), which leads to a crash as there is a double despawn/respawn, so we need to filter things out
 - [x] if there are many assets/blueprints that have changed at the same time, it causes issues similar to the above, so apply a similar fix
 - [x] also ignore any entities currently spawning (better to lose some information than cause a crash)
+- [x] for sub blueprint tracking: do not propagate/deal with parent blueprints if they are not themselves spawning (ie filter out by "BlueprintSpawning")
+- [x] cleanup internals
 - [ ] analyse what is off with blueprint level components
 - [ ] add the root blueprint itself to the assets, either on the blender side or on the bevy side programmatically
-- [x] for sub blueprint tracking: do not propagate/deal with parent blueprints if they are not themselves spawning (ie filter out by "BlueprintSpawning")
 - [ ] invalidate despawned entity & parent entities AABB
-- [x] cleanup internals
+- [ ] add unloading/cache removal of materials
 - [x] review & change general component insertion & spawning ordering & logic

View File

@@ -152,6 +152,28 @@ def register():
     bpy.app.handlers.depsgraph_update_post.append(post_update)
     bpy.app.handlers.save_post.append(post_save)
 
+    """ handle = object()
+    subscribe_to = bpy.types.Scene, "name" #
+
+    def notify_test(context):
+        #if (context.scene.type == 'MESH'):
+        print("Renamed", dir(context), context.scenes)
+
+    bpy.msgbus.subscribe_rna(
+        key=subscribe_to,
+        owner=bpy,
+        args=(bpy.context,),
+        notify=notify_test,
+    )"""
+    #bpy.msgbus.publish_rna(key=subscribe_to)
 
 def unregister():
     for cls in classes:
         bpy.utils.unregister_class(cls)
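
Editor's note: the commented-out experiment above is close to the working `bpy.msgbus` pattern. A minimal sketch, assuming it is acceptable to re-subscribe on every file load (msgbus subscriptions do not persist across loading a .blend); `_owner` and the function names are hypothetical:

    import bpy

    _owner = object()  # any stable Python object can own the subscription (hypothetical)

    def _on_scene_name_changed():
        # msgbus does not say WHICH scene was renamed, so rescan all scenes
        # and compare against a previously stored {scene: name} mapping
        print("a scene was renamed; current names:", [s.name for s in bpy.data.scenes])

    def subscribe_scene_renames():
        bpy.msgbus.subscribe_rna(
            key=(bpy.types.Scene, "name"),  # notify on any Scene.name change
            owner=_owner,
            args=(),
            notify=_on_scene_name_changed,
        )

    def unsubscribe_scene_renames():
        bpy.msgbus.clear_by_owner(_owner)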

View File

@@ -13,6 +13,7 @@ def prepare_and_export():
     auto_export_settings = blenvy.auto_export
     if auto_export_settings.auto_export: # only do the actual exporting if auto export is actually enabled
         # determine changed objects
         per_scene_changes, per_collection_changes, per_material_changes, project_hash = get_changes_per_scene(settings=blenvy)
         # determine changed parameters
@@ -22,14 +23,15 @@ def prepare_and_export():
         print("changes: collections:", per_collection_changes)
         print("changes: materials:", per_material_changes)
+        print("project_hash", project_hash)
 
         # do the actual export
         # blenvy.auto_export.dry_run = 'NO_EXPORT'#'DISABLED'#
         auto_export(per_scene_changes, per_collection_changes, per_material_changes, setting_changes, blenvy)
 
         # -------------------------------------
         # now that this point is reached, the export should have run correctly, so we can save all the current state to the "previous one"
+        for scene in bpy.data.scenes:
+            blenvy.scenes_to_scene_names[scene] = scene.name
+        print("bla", blenvy.scenes_to_scene_names, "hash", project_hash)
 
         # save the current project hash as previous
         upsert_settings(".blenvy.project_serialized_previous", project_hash, overwrite=True)
         # write the new settings to the old settings

View File

@@ -1,4 +1,5 @@
 import json
+import traceback
 import bpy
 from .serialize_project import serialize_project
 from blenvy.settings import load_settings, upsert_settings
@@ -18,6 +19,7 @@ def serialize_current(settings):
             print("GENERATE ID")
             scene.id_test = str(uuid.uuid4())
         print("SCENE ID", scene.id_test)
+    #https://blender.stackexchange.com/questions/216411/whats-the-replacement-for-id-or-hash-on-bpy-objects
 
     current_scene = bpy.context.window.scene
     bpy.context.window.scene = bpy.data.scenes[0]
@@ -39,19 +41,38 @@ def get_changes_per_scene(settings):
     previous = load_settings(".blenvy.project_serialized_previous")
     current = serialize_current(settings)
 
+    # in Blender there is no uuid per object: hashes change on undo/redo, and an object's
+    # address/pointer may change at any undo/redo with no way of knowing when... so, ugh
+    scenes_to_scene_names = {}
+    for scene in bpy.data.scenes:
+        scenes_to_scene_names[scene] = scene.name
+    print("cur scenes_to_scene_names", scenes_to_scene_names)
+    print("pre fpp", settings.scenes_to_scene_names)
+
+    scene_renames = {}
+    for scene in settings.scenes_to_scene_names:
+        if scene in scenes_to_scene_names:
+            previous_name_of_scene = settings.scenes_to_scene_names[scene]
+            current_name_of_scene = scenes_to_scene_names[scene]
+            if previous_name_of_scene != current_name_of_scene:
+                scene_renames[current_name_of_scene] = previous_name_of_scene
+                print("SCENE RENAMED! previous", previous_name_of_scene, "current", current_name_of_scene)
+    print("scene new name to old name", scene_renames)
+
     # determine changes
     changes_per_scene = {}
     changes_per_collection = {}
     changes_per_material = {}
     try:
-        (changes_per_scene, changes_per_collection, changes_per_material) = project_diff(previous, current, settings)
+        (changes_per_scene, changes_per_collection, changes_per_material) = project_diff(previous, current, scene_renames, settings)
     except Exception as error:
+        print(traceback.format_exc())
         print("failed to compare current serialized scenes to previous ones: Error:", error)
 
     return changes_per_scene, changes_per_collection, changes_per_material, current
 
-def project_diff(previous, current, settings):
+def project_diff(previous, current, scene_renames, settings):
     """print("previous", previous)
     print("current", current)"""
     if previous is None or current is None:
@@ -61,46 +82,54 @@ def project_diff(previous, current, settings):
     changes_per_collection = {}
     changes_per_material = {}
 
-    # TODO : how do we deal with changed scene names ???
     # possible ? on each save, inject an id into each scene, that cannot be copied over
 
     current_scenes = current["scenes"]
     previous_scenes = previous["scenes"]
 
-    for scene in current_scenes:
-        current_object_names = list(current_scenes[scene].keys())
-        if scene in previous_scenes: # we can only compare scenes that are in both previous and current data
-            previous_object_names = list(previous_scenes[scene].keys())
+    print("previous scenes", previous_scenes.keys())
+    print("current scenes", current_scenes.keys())
+    print("new names to old names", scene_renames)
+    print("")
+    for scene_name in current_scenes:
+        print("scene name", scene_name, scene_name in scene_renames)
+        current_scene = current_scenes[scene_name]
+        previous_scene = previous_scenes[scene_name] if not scene_name in scene_renames else previous_scenes[scene_renames[scene_name]]
+        current_object_names = list(current_scene.keys())
+        updated_scene_name = scene_name if not scene_name in scene_renames else scene_renames[scene_name]
+        if updated_scene_name in previous_scenes: # we can only compare scenes that are in both previous and current data; with the above we also account for renames
+            previous_object_names = list(previous_scene.keys())
             added = list(set(current_object_names) - set(previous_object_names))
             removed = list(set(previous_object_names) - set(current_object_names))
 
             for obj in added:
-                if not scene in changes_per_scene:
-                    changes_per_scene[scene] = {}
-                changes_per_scene[scene][obj] = bpy.data.objects[obj] if obj in bpy.data.objects else None
+                if not scene_name in changes_per_scene:
+                    changes_per_scene[scene_name] = {}
+                changes_per_scene[scene_name][obj] = bpy.data.objects[obj] if obj in bpy.data.objects else None
 
             # TODO: how do we deal with this, as we obviously do not have data for removed objects ?
             for obj in removed:
-                if not scene in changes_per_scene:
-                    changes_per_scene[scene] = {}
-                changes_per_scene[scene][obj] = None
+                if not scene_name in changes_per_scene:
+                    changes_per_scene[scene_name] = {}
+                changes_per_scene[scene_name][obj] = None
 
-            for object_name in list(current_scenes[scene].keys()): # TODO : exclude directly added/removed objects
-                if object_name in previous_scenes[scene]:
-                    current_obj = current_scenes[scene][object_name]
-                    prev_obj = previous_scenes[scene][object_name]
+            for object_name in list(current_scene.keys()): # TODO : exclude directly added/removed objects
+                if object_name in previous_scene:
+                    current_obj = current_scene[object_name]
+                    prev_obj = previous_scene[object_name]
                     same = str(current_obj) == str(prev_obj)
                     if not same:
-                        if not scene in changes_per_scene:
-                            changes_per_scene[scene] = {}
+                        if not scene_name in changes_per_scene:
+                            changes_per_scene[scene_name] = {}
                         target_object = bpy.data.objects[object_name] if object_name in bpy.data.objects else None
-                        changes_per_scene[scene][object_name] = target_object
-                        bubble_up_changes(target_object, changes_per_scene[scene])
+                        changes_per_scene[scene_name][object_name] = target_object
+                        bubble_up_changes(target_object, changes_per_scene[scene_name])
                         # now bubble up for instances & parents
         else:
-            print(f"scene {scene} not present in previous data")
+            print(f"scene {scene_name} not present in previous data")

View File

@@ -126,7 +126,10 @@ type_lookups = {
     bpy.types.bpy_prop_collection: _lookup_collection,
     bpy.types.MaterialLineArt: _lookup_materialLineArt,
     bpy.types.NodeTree: node_tree,
-    bpy.types.CurveProfile: _lookup_generic
+    bpy.types.CurveProfile: _lookup_generic,
+    bpy.types.RaytraceEEVEE: _lookup_generic,
+    bpy.types.CurveMapping: _lookup_generic,
+    bpy.types.MaterialGPencilStyle: _lookup_generic,
 }
 
 def convert_field(raw_value, field_name="", scan_node_tree=True):
@@ -234,10 +237,11 @@ def animation_hash(obj):
 def custom_properties_hash(obj):
     custom_properties = {}
     for property_name in obj.keys():
-        if property_name not in '_RNA_UI' and property_name != 'components_meta':
-            print("custom properties stuff for", obj, property_name)
-            custom_properties[property_name] = obj[property_name]
-    print("custom props for hashing", custom_properties, str(h1_hash(str(custom_properties))) )
+        if property_name not in '_RNA_UI' and property_name != 'components_meta' and property_name != 'user_assets':
+            custom_properties[property_name] = obj[property_name] #generic_fields_hasher_evolved(data=obj[property_name],fields_to_ignore=fields_to_ignore_generic)
+        """if property_name == "user_assets":
+            print("tptp")
+            custom_properties[property_name] = generic_fields_hasher_evolved(data=obj[property_name],fields_to_ignore=fields_to_ignore_generic)"""
 
     return str(h1_hash(str(custom_properties)))
 
 def camera_hash(obj):
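
Editor's note: one aside on the condition above: `property_name not in '_RNA_UI'` is a substring test on a string, not membership in a list, so any property whose name is a substring of `'_RNA_UI'` (e.g. `'RNA'`) is also skipped. A set of excluded keys expresses the intent directly (a sketch, not the committed code):

    EXCLUDED_PROPERTY_KEYS = {'_RNA_UI', 'components_meta', 'user_assets'}

    def hashable_custom_properties(obj):
        # keep only the custom properties that should contribute to the hash
        return {key: obj[key] for key in obj.keys() if key not in EXCLUDED_PROPERTY_KEYS}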
@@ -273,12 +277,18 @@ def armature_hash(obj):
         print("bone", bone, bone_hash(bone))"""
     return str(fields)
 
-def material_hash(material, settings):
-    scan_node_tree = settings.auto_export.materials_in_depth_scan
-    #print("HASHING MATERIAL", material.name)
-    hashed_material = generic_fields_hasher_evolved(material, fields_to_ignore_generic, scan_node_tree=scan_node_tree)
-    #print("HASHED MATERIAL", hashed_material)
-    return str(hashed_material)
+def material_hash(material, cache, settings):
+    cached_hash = cache['materials'].get(material.name, None)
+    if cached_hash:
+        return cached_hash
+    else:
+        scan_node_tree = settings.auto_export.materials_in_depth_scan
+        #print("HASHING MATERIAL", material.name)
+        hashed_material = generic_fields_hasher_evolved(material, fields_to_ignore_generic, scan_node_tree=scan_node_tree)
+        #print("HASHED MATERIAL", hashed_material)
+        hashed_material = str(hashed_material)
+        cache['materials'][material.name] = hashed_material
+        return hashed_material
 
 # TODO: this is partially taken from export_materials utilities, perhaps we could avoid having to fetch things multiple times ?
 def materials_hash(obj, cache, settings):
@@ -286,16 +296,7 @@ def materials_hash(obj, cache, settings):
     materials = []
     for material_slot in obj.material_slots:
         material = material_slot.material
-        """cached_hash = cache['materials'].get(material.name, None)
-        if cached_hash:
-            materials.append(cached_hash)
-            print("CAACHED")
-        else:
-            mat = material_hash(material, settings)
-            cache['materials'][material.name] = mat
-            materials.append(mat)"""
-        mat = material_hash(material, settings)
-        cache['materials'][material.name] = mat
+        mat = material_hash(material, cache, settings)
         materials.append(mat)
 
     return str(h1_hash(str(materials)))
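
Editor's note: the net effect of the two hunks above is that the read-through cache now lives inside material_hash itself, so each material is hashed at most once per serialization pass no matter how many objects and slots reference it (previously, materials_hash recomputed the hash for every slot and only ever wrote to the cache without reading it). The same pattern in isolation, with hypothetical names:

    def cached_material_hash(material_name, cache, compute_hash):
        # read-through cache: compute once per pass, keyed by material name
        materials_cache = cache.setdefault('materials', {})
        if material_name not in materials_cache:
            materials_cache[material_name] = compute_hash(material_name)
        return materials_cache[material_name]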
@@ -308,7 +309,6 @@ def modifier_hash(modifier_data, settings):
     #print("modifier", modifier_data.name, "hashed", hashed_modifier)
     return str(hashed_modifier)
 
 def modifiers_hash(object, settings):
     modifiers = []
     for modifier in object.modifiers:
@@ -317,26 +317,22 @@ def modifiers_hash(object, settings):
         #print(" ")
     return str(h1_hash(str(modifiers)))
 
 def serialize_project(settings):
     cache = {"materials":{}}
-    print("serializing scenes")
+    print("serializing project")
     per_scene = {}
-    # render settings are injected into each scene
-    # TODO: only go through scenes actually in our list
-    for scene in bpy.data.scenes:
+    for scene in settings.main_scenes + settings.library_scenes: #bpy.data.scenes:
         print("scene", scene.name)
         # ignore temporary scenes
         if scene.name.startswith(TEMPSCENE_PREFIX):
             continue
         per_scene[scene.name] = {}
 
         custom_properties = custom_properties_hash(scene) if len(scene.keys()) > 0 else None
+        # render settings are injected into each scene
         eevee_settings = generic_fields_hasher_evolved(scene.eevee, fields_to_ignore=fields_to_ignore_generic) # TODO: ignore most of the fields
         view_settings = generic_fields_hasher_evolved(scene.view_settings, fields_to_ignore=fields_to_ignore_generic)
@@ -347,7 +343,6 @@ def serialize_project(settings):
         }
         #generic_fields_hasher_evolved(scene.eevee, fields_to_ignore=fields_to_ignore_generic)
         # FIXME: how to deal with this cleanly
-        print("SCENE CUSTOM PROPS", custom_properties)
 
         per_scene[scene.name]["____scene_settings"] = str(h1_hash(str(scene_field_hashes)))
@@ -387,39 +382,34 @@ def serialize_project(settings):
             objectHash = str(h1_hash(str(object_field_hashes_filtered)))
             per_scene[scene.name][object.name] = objectHash
 
     per_collection = {}
     # also hash collections (important to catch component changes per blueprints/collections)
-    collections_in_scene = [collection for collection in bpy.data.collections if scene.user_of_id(collection)]
+    # collections_in_scene = [collection for collection in bpy.data.collections if scene.user_of_id(collection)]
     for collection in bpy.data.collections: # collections_in_scene:
         #loc, rot, scale = bpy.context.object.matrix_world.decompose()
         #visibility = collection.visible_get()
         custom_properties = custom_properties_hash(collection) if len(collection.keys()) > 0 else None
         # parent = collection.parent.name if collection.parent else None
         #collections = [collection.name for collection in object.users_collection]
         collection_field_hashes = {
             "name": collection.name,
             # "visibility": visibility,
             "custom_properties": custom_properties,
             #"parent": parent,
             #"collections": collections,
         }
         collection_field_hashes_filtered = {key: collection_field_hashes[key] for key in collection_field_hashes.keys() if collection_field_hashes[key] is not None}
 
         collectionHash = str(h1_hash(str(collection_field_hashes_filtered)))
         per_collection[collection.name] = collectionHash
 
     # and also hash materials to avoid constantly exporting materials libraries
     # actually this should be similar to change detection for scenes
     per_material = {}
     for material in bpy.data.materials:
-        per_material[material.name] = str(h1_hash(material_hash(material, settings)))
-    print("materials_hash", per_material)
-
-    """print("data", data)
-    print("")
-    print("")
-    print("data json", json.dumps(data))"""
+        per_material[material.name] = str(h1_hash(material_hash(material, cache, settings)))
 
     return {"scenes": per_scene, "collections": per_collection, "materials": per_material}

View File

@@ -33,7 +33,6 @@ def inject_export_path_into_internal_blueprints(internal_blueprints, blueprints_
     for blueprint in internal_blueprints:
         blueprint_exported_path = os.path.join(blueprints_path, f"{blueprint.name}{gltf_extension}")
         # print("injecting blueprint path", blueprint_exported_path, "for", blueprint.name)
-        print("blueprint_exported_path", blueprint_exported_path)
         blueprint.collection["export_path"] = blueprint_exported_path
         if export_materials_library:
             blueprint.collection["materials_path"] = materials_exported_path

View File

@@ -37,9 +37,9 @@ def is_scene_already_in_use(self, scene):
         return True
 
 class BlenvyManager(PropertyGroup):
     settings_save_path = ".blenvy_common_settings" # where to store data in bpy.texts
     settings_save_enabled: BoolProperty(name="settings save enabled", default=True) # type: ignore
+    scenes_to_scene_names = {} # used to map scenes to scene names, to detect scene renames for diffing
 
     mode: EnumProperty(
         items=(
@@ -187,3 +187,7 @@ class BlenvyManager(PropertyGroup):
         # now load component settings
         self.components.load_settings()
 
+        for scene in bpy.data.scenes:
+            self.scenes_to_scene_names[scene] = scene.name
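
Editor's note: since keying a dict by bpy.types.Scene objects is fragile (pointers can change across undo/redo, as the comment in get_changes_per_scene notes), one alternative worth sketching is the uuid-per-scene idea the commit already experiments with via scene.id_test: store a custom property that survives renames and key the name mapping by it. A sketch with a hypothetical property name; note the committed comment's caveat that such an id is copied over when a scene is duplicated:

    import uuid
    import bpy

    SCENE_ID_KEY = "blenvy_scene_id"  # hypothetical custom property name

    def ensure_scene_ids():
        for scene in bpy.data.scenes:
            if SCENE_ID_KEY not in scene:
                scene[SCENE_ID_KEY] = str(uuid.uuid4())

    def detect_renames(previous_ids_to_names):
        # previous_ids_to_names: {scene_id: name_at_last_save}
        renames = {}
        for scene in bpy.data.scenes:
            old_name = previous_ids_to_names.get(scene.get(SCENE_ID_KEY))
            if old_name is not None and old_name != scene.name:
                renames[scene.name] = old_name  # new name -> old name, as in project_diff
        return renames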