feat(blenvy:blender): a ton of cleanups, fixes & improvements

* fixed bad hashing that caused the project hash to differ across two different Blender sessions;
  aka, no more systematic re-export of everything when reloading a project in Blender!
* fixed issues with modifier & material hashing that were also causing overly eager change detection
* previous_xxx_settings are now only saved AFTER a successful export, for coherence
* added more fine-grained setting change detection (aka some setting changes do not require a re-export of all levels & blueprints!)
* fixed handling of level & library scene names as part of the settings
* fixed numerous issues with core, auto_export & component settings
* cleaned up a ton of very verbose debug messages
* BlenvyAssets => BlueprintAssets
* a lot of minor cleanups
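
For background on the first fix: Python salts hash() for str/bytes per interpreter process (PYTHONHASHSEED), so the same serialized scene data hashes differently on every Blender restart, which made every reload look like a change. The commit switches to a stable hashlib digest instead (see the new h1_hash helper in the serialize_scene.py diff below). A minimal sketch of the difference:

import hashlib

data = "serialized_scene_data"

# hash() on str/bytes is salted per process: this value changes
# from one Blender session to the next, defeating change detection
print(hash(data))

# an md5 digest of the same bytes is identical in every session,
# so comparing against the stored "previous" project hash works reliably
print(hashlib.md5(data.encode("utf-8")).hexdigest())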
This commit is contained in:
kaosat.dev 2024-06-25 18:27:52 +02:00
parent ee5c74aa9e
commit 31f6a0f122
21 changed files with 212 additions and 204 deletions

View File

@@ -139,29 +139,53 @@ General issues:
 - [x] overall cleanup
 - [x] object.add_bevy_component => blenvy.component_add
+Blender side:
+- [x] force overwrite of settings files instead of partial updates ?
+- [x] prevent loop when loading/setting/saving settings
+- [x] fix asset changes not being detected as a scene change
+- [x] fix scene setting changes not being detected as a scene change
+- [x] add back lighting_components
+- [x] check if scene components are being deleted through our scene re-orgs in the spawn post process
+- [x] fix unreliable project hashing between sessions: (note: it is due to the use of hash(): https://stackoverflow.com/questions/27522626/hash-function-in-python-3-3-returns-different-results-between-sessions)
+  - [x] figure out why there are still changes per session (it is due to object pointers being present in the generated "hash")
+    - materials & modifiers, both using the same underlying logic
+  - [x] filter out components_meta
+  - [x] filter out xxx_ui propgroups
+- [x] fix missing main/lib scene names in blenvy_common_settings
+- [x] fix incorrect updating of main/lib scenes list in settings
+  - [ ] and what about scene renames ?? perhaps trigger a forced "save settings" before doing the export ?
+- [x] should we write the previous _xxx data only AFTER a successful export only ?
+- [x] finer grained control of setting changes to trigger a re-export:
+  - [x] common: any of them should trigger
+  - [x] components: none
+  - [x] auto_export:
+    - auto_export: yes
+    - gltf settings: yes
+    - change detection: no ?
+    - export blueprints: YES
+    - export split dynamic/static: YES
+    - export merge mode: YES
+    - materials: YES
 - [ ] inject_export_path_into_internal_blueprints should be called on every asset/blueprint scan !! Not just on export
 - [ ] undo after a save removes any saved "serialized scene" data ? DIG into this
 - [ ] handle scene renames between saves (breaks diffing) => very hard to achieve
-- [ ] force overwrite of settings files instead of partial updates ?
 - [ ] add tests for disabled components
-- [ ] should we write the previous _xxx data only AFTER a successful export only ?
+- [ ] find a solution for the new color handling
+- [ ] hidden objects/collections not respected at export !!!
 - [ ] add option to 'split out' meshes from blueprints ?
   - [ ] ie considering meshlets etc, it would make sense to keep blueprints separate from purely mesh gltfs
 - [ ] persist exported materials path in blueprints so that it can be read from library file users
   - [ ] just like "export_path" write it into each blueprint's collection
 - [ ] scan for used materials per blueprint !
 - [ ] for scenes, scan for used materials of all non instance objects (TODO: what about overrides ?)
-- [ ] find a solution for the new color handling
-- [x] add back lighting_components
-- [x] check if scene components are being deleted through our scene re-orgs in the spawn post process
-- [ ] should "blueprint spawned" only be triggered after all its sub blueprints have spawned ?
+Bevy Side:
+- [x] deprecate BlueprintName & BlueprintPath & use BlueprintInfo instead
+- [ ] should "blueprint spawned" only be triggered after all its sub blueprints have spawned ?
 - [ ] simplify testing example:
 - [x] remove use of rapier physics (or even the whole common boilerplate ?)
 - [ ] remove/replace bevy editor pls with some native ui to display hierarchies
-- [ ] try out hot reloading
+- [x] try out hot reloading
 - [ ] simplify examples:
 - [ ] a full fledged demo (including physics & co)
 - [ ] other examples without interactions or physics
@@ -170,12 +194,9 @@ General issues:
 - [ ] replace all references to the old 2 add-ons with those to Blenvy
 - [ ] rename repo to "Blenvy"
 - [ ] do a deprecation release of all bevy_gltf_xxx crates to point at the new Blenvy crate
-- [ ] hidden objects/collections not respected at export !!!
 - [ ] add a way of overriding assets for collection instances
 - [ ] add a way of visualizing per blueprint instances
 - [ ] cleanup all the spurious debug messages
-- [ ] deprecate BlueprintName & BlueprintPath & use BlueprintInfo instead
 - [ ] fix animation handling
 clear && pytest -svv --blender-template ../../testing/bevy_example/art/testing_library.blend --blender-executable /home/ckaos/tools/blender/blender-4.1.0-linux-x64/blender tests/test_bevy_integration_prepare.py && pytest -svv --blender-executable /home/ckaos/tools/blender/blender-4.1.0-linux-x64/blender tests/test_bevy_integration.py

View File

@@ -11,7 +11,7 @@ def cleanup_file():
         os.remove(gltf_filepath)
         return None
     else:
-        return 1
+        return 1.0
 def gltf_post_export_callback(data):
     #print("post_export", data)

View File

@@ -36,14 +36,9 @@ def export_blueprints(blueprints, settings, blueprints_data):
         collection = bpy.data.collections[blueprint.name]
-        print("BLUEPRINT", blueprint.name)
-        for asset in collection.user_assets:
-            print("  user asset", asset.name, asset.path)
         all_assets = []
         auto_assets = []
-        collection["BlenvyAssets"] = assets_to_fake_ron([]) #assets_to_fake_ron([{"name": asset.name, "path": asset.path} for asset in collection.user_assets] + auto_assets) #all_assets + [{"name": asset.name, "path": asset.path} for asset in collection.user_assets] + auto_assets)
+        collection["BlueprintAssets"] = assets_to_fake_ron([]) #assets_to_fake_ron([{"name": asset.name, "path": asset.path} for asset in collection.user_assets] + auto_assets) #all_assets + [{"name": asset.name, "path": asset.path} for asset in collection.user_assets] + auto_assets)
         # do the actual export

View File

@@ -51,7 +51,6 @@ def auto_export(changes_per_scene, changed_export_parameters, settings):
         for blueprint in blueprints_data.blueprints:
             bpy.context.window_manager.blueprints_registry.add_blueprint(blueprint)
         #bpy.context.window_manager.blueprints_registry.refresh_blueprints()
-        print("YO YO")
         if export_scene_settings:
             # inject/ update scene components

View File

@@ -90,7 +90,7 @@ def duplicate_object(object, parent, combine_mode, destination_collection, bluep
     original_name = object.name
     blueprint_name = original_collection.name
     # FIXME: blueprint path is WRONG !
-    print("BLUEPRINT PATH", original_collection.get('export_path', None))
+    # print("BLUEPRINT PATH", original_collection.get('export_path', None))
     blueprint_path = original_collection['export_path'] if 'export_path' in original_collection else f'./{blueprint_name}' # TODO: the default requires the currently used extension !!

View File

@@ -19,7 +19,6 @@ def generate_temporary_scene_and_export(settings, gltf_export_settings, gltf_out
     temp_scene = bpy.data.scenes.new(name=temp_scene_name)
     temp_root_collection = temp_scene.collection
-    print("additional_dataAAAAAAAAAAAAAAAH", additional_data)
     properties_black_list = ['glTF2ExportSettings', 'assets', 'user_assets', 'components_meta', 'Components_meta', 'Generated_assets', 'generated_assets']
     if additional_data is not None: # FIXME not a fan of having this here
         for entry in dict(additional_data):

View File

@@ -3,6 +3,7 @@ import bpy
 from .project_diff import get_changes_per_scene
 from .auto_export import auto_export
 from .settings_diff import get_setting_changes
+from blenvy.settings import load_settings, upsert_settings
 # prepare the export by gathering the changes to the scenes & settings
 def prepare_and_export():
@@ -13,17 +14,28 @@ def prepare_and_export():
     if auto_export_settings.auto_export: # only do the actual exporting if auto export is actually enabled
         # determine changed objects
-        per_scene_changes = get_changes_per_scene(settings=blenvy)
+        per_scene_changes, project_hash = get_changes_per_scene(settings=blenvy)
         # determine changed parameters
-        setting_changes = get_setting_changes()
+        setting_changes, current_common_settings, current_export_settings, current_gltf_settings = get_setting_changes()
-        print("setting_changes", setting_changes)
+        print("changes: settings:", setting_changes)
+        print("changes: scenes:", per_scene_changes)
+        print("project_hash", project_hash)
         # do the actual export
         # blenvy.auto_export.dry_run = 'NO_EXPORT'#'DISABLED'#
         auto_export(per_scene_changes, setting_changes, blenvy)
+        # -------------------------------------
+        # now that this point is reached, the export should have run correctly, so we can save all the current state as the "previous" one
+        # save the current project hash as previous
+        upsert_settings(".blenvy.project_serialized_previous", project_hash, overwrite=True)
+        # write the new settings to the old settings
+        upsert_settings(".blenvy_common_settings_previous", current_common_settings, overwrite=True)
+        upsert_settings(".blenvy_export_settings_previous", current_export_settings, overwrite=True)
+        upsert_settings(".blenvy_gltf_settings_previous", current_gltf_settings, overwrite=True)
         # cleanup
         # TODO: these are likely obsolete
         # reset the list of changes in the tracker
         #bpy.context.window_manager.auto_export_tracker.clear_changes()
         print("AUTO EXPORT DONE")
+        #bpy.app.timers.register(bpy.context.window_manager.auto_export_tracker.enable_change_detection, first_interval=0.1)

View File

@@ -22,6 +22,7 @@ def serialize_current(settings):
     current_scene = bpy.context.window.scene
     bpy.context.window.scene = bpy.data.scenes[0]
     #serialize scene at frame 0
+    # TODO: add back
     """with bpy.context.temp_override(scene=bpy.data.scenes[1]):
         bpy.context.scene.frame_set(0)"""
@@ -38,7 +39,6 @@ def get_changes_per_scene(settings):
     previous = load_settings(".blenvy.project_serialized_previous")
     current = serialize_current(settings)
-
     # determine changes
     changes_per_scene = {}
     try:
@@ -46,11 +46,7 @@ def get_changes_per_scene(settings):
     except Exception as error:
         print("failed to compare current serialized scenes to previous ones", error)
-    # save the current project as previous
-    upsert_settings(".blenvy.project_serialized_previous", current, overwrite=True)
-    print("changes per scene", changes_per_scene)
-    return changes_per_scene
+    return changes_per_scene, current
 def project_diff(previous, current, settings):
@@ -58,17 +54,13 @@ def project_diff(previous, current, settings):
         print("current", current)"""
     if previous is None or current is None:
         return {}
-    print("Settings", settings,"current", current, "previous", previous)
     changes_per_scene = {}
     # TODO : how do we deal with changed scene names ???
     # possible ? on each save, inject an id into each scene, that cannot be copied over
-    print('TEST SCENE', bpy.data.scenes.get("ULTRA LEVEL2"), None)
     for scene in current:
-        print("SCENE", scene)
         current_object_names = list(current[scene].keys())
         if scene in previous: # we can only compare scenes that are in both previous and current data
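
The TODO above (how to deal with changed scene names) suggests injecting an id into each scene on save. A minimal sketch of that idea, with blenvy_scene_id as a hypothetical property name; note that custom properties do get copied when a scene is duplicated, which is exactly the open question raised in the TODO:

import uuid
import bpy

def ensure_scene_ids():
    # tag every scene with a stable id that survives renames
    for scene in bpy.data.scenes:
        if "blenvy_scene_id" not in scene:
            scene["blenvy_scene_id"] = str(uuid.uuid4())

def scene_names_by_id():
    # diffing keyed on these ids instead of names would survive renames
    return {scene["blenvy_scene_id"]: scene.name
            for scene in bpy.data.scenes if "blenvy_scene_id" in scene}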

View File

@@ -7,13 +7,23 @@ import numpy as np
 import bpy
 from ..constants import TEMPSCENE_PREFIX
+import hashlib
+# horrible and inefficient
+def h1_hash(w):
+    try:
+        w = w.encode('utf-8')
+    except: pass
+    return hashlib.md5(w).hexdigest()
 fields_to_ignore_generic = [
     "tag", "type", "update_tag", "use_extra_user", "use_fake_user", "user_clear", "user_of_id", "user_remap", "users",
     'animation_data_clear', 'animation_data_create', 'asset_clear', 'asset_data', 'asset_generate_preview', 'asset_mark', 'bl_rna', 'evaluated_get',
     'library', 'library_weak_reference', 'make_local','name', 'name_full', 'original',
     'override_create', 'override_hierarchy_create', 'override_library', 'preview', 'preview_ensure', 'rna_type',
-    'session_uid', 'copy', 'id_type', 'is_embedded_data', 'is_evaluated', 'is_library_indirect', 'is_missing', 'is_runtime_data'
+    'session_uid', 'copy', 'id_type', 'is_embedded_data', 'is_evaluated', 'is_library_indirect', 'is_missing', 'is_runtime_data',
+    'components_meta', 'cycles'
 ]
@@ -40,9 +50,7 @@ def _lookup_array2(data):
     return peel_value(data)
 def _lookup_prop_group(data):
-    bla = generic_fields_hasher_evolved(data, fields_to_ignore=fields_to_ignore_generic)
-    print("PROPGROUP", bla)
-    return bla
+    return generic_fields_hasher_evolved(data, fields_to_ignore=fields_to_ignore_generic)
 def _lookup_collection(data):
     return [generic_fields_hasher_evolved(item, fields_to_ignore=fields_to_ignore_generic) for item in data]
@@ -50,9 +58,16 @@ def _lookup_collection(data):
 def _lookup_materialLineArt(data):
     return generic_fields_hasher_evolved(data, fields_to_ignore=fields_to_ignore_generic)
+def _lookup_object(data):
+    return data.name
+    return generic_fields_hasher_evolved(data, fields_to_ignore=fields_to_ignore_generic)
+def _lookup_generic(data):
+    return generic_fields_hasher_evolved(data, fields_to_ignore=fields_to_ignore_generic)
 # used for various node trees: shaders, modifiers etc
 def node_tree(node_tree):
-    print("SCANNING NODE TREE", node_tree)
+    #print("SCANNING NODE TREE", node_tree)
     # storage for hashing
     links_hashes = []
@@ -99,18 +114,19 @@ def node_tree(node_tree):
         links_hashes.append(link_hash)
     #print("node hashes",nodes_hashes, "links_hashes", links_hashes)
-    print("root_inputs", root_inputs)
     return f"{str(root_inputs)}_{str(nodes_hashes)}_{str(links_hashes)}"
 type_lookups = {
     Color: _lookup_color,#lambda input: print("dsf")',
+    bpy.types.Object: _lookup_object,
     bpy.types.FloatVectorAttribute: _lookup_array2,
     bpy.types.bpy_prop_array: _lookup_array,
     bpy.types.PropertyGroup: _lookup_prop_group,
     bpy.types.bpy_prop_collection: _lookup_collection,
     bpy.types.MaterialLineArt: _lookup_materialLineArt,
     bpy.types.NodeTree: node_tree,
+    bpy.types.CurveProfile: _lookup_generic
 }
 def convert_field(raw_value, field_name="", scan_node_tree=True):
@@ -122,6 +138,7 @@ def convert_field(raw_value, field_name="", scan_node_tree=True):
     conversion_lookup = None # type_lookups.get(type(raw_value), None)
     all_types = inspect.getmro(type(raw_value))
     for s_type in all_types:
+        #print("  stype", s_type)
         if type_lookups.get(s_type, None) is not None:
             conversion_lookup = type_lookups[s_type]
             break
@@ -132,6 +149,9 @@ def convert_field(raw_value, field_name="", scan_node_tree=True):
         #print("field_name",field_name,"conv value", field_value)
     else:
         #print("field_name",field_name,"raw value", raw_value)
+        """try:
+            field_value = _lookup_generic(raw_value)
+        except: pass"""
         field_value = raw_value
     return field_value
@@ -146,6 +166,7 @@ def obj_to_dict(object):
 # TODO: replace the first one with this one once it's done
 def generic_fields_hasher_evolved(data, fields_to_ignore, scan_node_tree=True):
     dict_data = obj_to_dict(data) # in some cases, some data is in the key/value pairs of the object
+    dict_data = {key: dict_data[key] for key in dict_data.keys() if key not in fields_to_ignore} # we need to filter out fields here too
     all_field_names = dir(data)
     field_values = []
     for field_name in all_field_names:
@@ -153,6 +174,8 @@ def generic_fields_hasher_evolved(data, fields_to_ignore, scan_node_tree=True):
         raw_value = getattr(data, field_name, None)
         #print("raw value", raw_value, "type", type(raw_value), isinstance(raw_value, Color), isinstance(raw_value, bpy.types.bpy_prop_array))
         field_value = convert_field(raw_value, field_name, scan_node_tree)
+        #print("field name", field_name, "raw", raw_value, "converted", field_value)
         field_values.append(str(field_value))
     return str(dict_data) + str(field_values)
@@ -163,7 +186,7 @@ def mesh_hash(obj):
     vertex_count = len(obj.data.vertices)
     vertices_np = np.empty(vertex_count * 3, dtype=np.float32)
     obj.data.vertices.foreach_get("co", vertices_np)
-    h = str(hash(vertices_np.tobytes()))
+    h = str(h1_hash(vertices_np.tobytes()))
     return h
 # TODO: redo this one, this is essentially modified copy & pasted data, not fitting
@@ -202,7 +225,7 @@ def animation_hash(obj):
                 markers_per_animation[animation_name][marker.frame] = []
             markers_per_animation[animation_name][marker.frame].append(marker.name)
-    compact_result = hash(str((blender_actions, blender_tracks, markers_per_animation, animations_infos)))
+    compact_result = h1_hash(str((blender_actions, blender_tracks, markers_per_animation, animations_infos)))
     return compact_result
@@ -213,7 +236,7 @@ def custom_properties_hash(obj):
     for property_name in obj.keys():
         if property_name not in '_RNA_UI' and property_name != 'components_meta':
             custom_properties[property_name] = obj[property_name]
-    return str(hash(str(custom_properties)))
+    return str(h1_hash(str(custom_properties)))
 def camera_hash(obj):
     camera_data = obj.data
@@ -233,7 +256,7 @@ def bones_hash(bones):
         fields = [getattr(bone, prop, None) for prop in all_field_names if not prop.startswith("__") and not prop in fields_to_ignore and not prop.startswith("show_")]
         bones_result.append(fields)
     #print("fields of bone", bones_result)
-    return str(hash(str(bones_result)))
+    return str(h1_hash(str(bones_result)))
 # fixme: not good enough ?
 def armature_hash(obj):
@@ -250,8 +273,10 @@ def armature_hash(obj):
 def material_hash(material, settings):
     scan_node_tree = settings.auto_export.materials_in_depth_scan
-    hashed_material_except_node_tree = generic_fields_hasher_evolved(material, fields_to_ignore_generic, scan_node_tree=scan_node_tree)
-    return str(hashed_material_except_node_tree)
+    #print("HASHING MATERIAL", material.name)
+    hashed_material = generic_fields_hasher_evolved(material, fields_to_ignore_generic, scan_node_tree=scan_node_tree)
+    #print("HASHED MATERIAL", hashed_material)
+    return str(hashed_material)
 # TODO: this is partially taken from export_materials utilities, perhaps we could avoid having to fetch things multiple times ?
 def materials_hash(obj, cache, settings):
@@ -271,21 +296,23 @@ def materials_hash(obj, cache, settings):
             cache['materials'][material.name] = mat
         materials.append(mat)
-    return str(hash(str(materials)))
+    return str(h1_hash(str(materials)))
 def modifier_hash(modifier_data, settings):
     scan_node_tree = settings.auto_export.modifiers_in_depth_scan
+    #print("HASHING MODIFIER", modifier_data.name)
     hashed_modifier = generic_fields_hasher_evolved(modifier_data, fields_to_ignore_generic, scan_node_tree=scan_node_tree)
+    #print("modifier", modifier_data.name, "hashed", hashed_modifier)
     return str(hashed_modifier)
 def modifiers_hash(object, settings):
     modifiers = []
     for modifier in object.modifiers:
-        print("modifier", modifier )# modifier.node_group)
+        #print("modifier", modifier )# modifier.node_group)
         modifiers.append(modifier_hash(modifier, settings))
-    print(" ")
+    #print(" ")
-    return str(hash(str(modifiers)))
+    return str(h1_hash(str(modifiers)))
 def serialize_scene(settings):
     cache = {"materials":{}}
@@ -309,16 +336,15 @@ def serialize_scene(settings):
             "custom_properties": custom_properties,
             "eevee": eevee_settings
         }
-        print("SCENE WORLD", scene.world, dir(scene.eevee))
         #generic_fields_hasher_evolved(scene.eevee, fields_to_ignore=fields_to_ignore_generic)
-        data[scene.name]["____scene_settings"] = str(hash(str(scene_field_hashes)))
-        print("SCENE CUSTOM PROPS", custom_properties)
+        # FIXME: how to deal with this cleanly
+        data[scene.name]["____scene_settings"] = str(h1_hash(str(scene_field_hashes)))
         for object in scene.objects:
             object = bpy.data.objects[object.name]
             #loc, rot, scale = bpy.context.object.matrix_world.decompose()
             transform = str((object.location, object.rotation_euler, object.scale)) #str((object.matrix_world.to_translation(), object.matrix_world.to_euler('XYZ'), object.matrix_world.to_quaternion()))#
             visibility = object.visible_get()
             custom_properties = custom_properties_hash(object) if len(object.keys()) > 0 else None
@@ -332,7 +358,6 @@ def serialize_scene(settings):
             materials = materials_hash(object, cache, settings) if len(object.material_slots) > 0 else None
             modifiers = modifiers_hash(object, settings) if len(object.modifiers) > 0 else None
-
             object_field_hashes = {
                 "name": object.name,
                 "transforms": transform,
@@ -348,8 +373,9 @@ def serialize_scene(settings):
                 "materials": materials,
                 "modifiers": modifiers
             }
             object_field_hashes_filtered = {key: object_field_hashes[key] for key in object_field_hashes.keys() if object_field_hashes[key] is not None}
-            objectHash = str(hash(str(object_field_hashes_filtered)))
+            objectHash = str(h1_hash(str(object_field_hashes_filtered)))
             data[scene.name][object.name] = objectHash
     """print("data", data)

View File

@@ -1,56 +0,0 @@
-    #print("THIS IS A GEOMETRY NODE")
-    # storage for hashing
-    links_hashes = []
-    nodes_hashes = []
-    modifier_inputs = dict(modifier_data)
-    for node in node_group.nodes:
-        #print("node", node, node.type, node.name, node.label)
-        #print("node info", dir(node))
-        input_hashes = []
-        for input in node.inputs:
-            #print(" input", input, "label", input.label, "name", input.name)
-            input_hash = f"{getattr(input, 'default_value', None)}"
-            input_hashes.append(input_hash)
-            """if hasattr(input, "default_value"):
-                print("YOHO", dict(input), input.default_value)"""
-        output_hashes = []
-        # IF the node itself is a group input, its outputs are the inputs of the geometry node (yes, not easy)
-        node_in_use = True
-        for (index, output) in enumerate(node.outputs):
-            # print(" output", output, "label", output.label, "name", output.name, "generated name", f"Socket_{index+1}")
-            output_hash = f"{getattr(output, 'default_value', None)}"
-            output_hashes.append(output_hash)
-            """if hasattr(output, "default_value"):
-                print("YOHO", output.default_value)"""
-            node_in_use = node_in_use and hasattr(output, "default_value")
-        #print("NODE IN USE", node_in_use)
-        node_fields_to_ignore = fields_to_ignore_generic + ['internal_links', 'inputs', 'outputs']
-        node_hash = f"{generic_fields_hasher(node, node_fields_to_ignore)}_{str(input_hashes)}_{str(output_hashes)}"
-        #print("node hash", node_hash)
-        nodes_hashes.append(node_hash)
-        #print(" ")
-    for link in node_group.links:
-        """print("LINK", link) #dir(link)
-        print("FROM", link.from_node, link.from_socket)
-        print("TO", link.to_node, link.to_socket)"""
-        from_socket_default = link.from_socket.default_value if hasattr(link.from_socket, "default_value") else None
-        to_socket_default = link.to_socket.default_value if hasattr(link.to_socket, "default_value") else None
-        link_hash = f"{link.from_node.name}_{link.from_socket.name}_{from_socket_default}+{link.to_node.name}_{link.to_socket.name}_{to_socket_default}"
-        """if hasattr(link.from_socket, "default_value"):
-            print("[FROM SOCKET]", link.from_socket.default_value)
-        if hasattr(link.to_socket, "default_value"):
-            print("[TO SOCKET]", link.to_socket.default_value)"""
-        links_hashes.append(link_hash)
-        #print("link_hash", link_hash)
-    return f"{str(modifier_inputs)}_{str(nodes_hashes)}_{str(links_hashes)}"

View File

@@ -1,8 +1,8 @@
 import bpy
-from blenvy.settings import are_settings_identical, load_settings, upsert_settings
+from blenvy.settings import are_settings_identical, load_settings, changed_settings
-# which settings are specific to auto_export # TODO: can we infer this ?
+# which common settings changes should trigger a re-export
 parameter_names_whitelist_common = [
     # blenvy core
     'project_root_path',
@@ -14,6 +14,7 @@ parameter_names_whitelist_common = [
     'library_scene_names',
 ]
+# which auto export settings changes should trigger a re-export
 parameter_names_whitelist_auto_export = [
     # auto export
     'export_scene_settings',
@@ -24,34 +25,29 @@ parameter_names_whitelist_auto_export = [
 ]
 def get_setting_changes():
-    print("get setting changes")
     previous_common_settings = load_settings(".blenvy_common_settings_previous")
     current_common_settings = load_settings(".blenvy_common_settings")
-    common_settings_changed = not are_settings_identical(previous_common_settings, current_common_settings, white_list=parameter_names_whitelist_common)
+    changed_common_settings_fields = changed_settings(previous_common_settings, current_common_settings, white_list=parameter_names_whitelist_common)
+    common_settings_changed = len(changed_common_settings_fields) > 0
     previous_export_settings = load_settings(".blenvy_export_settings_previous")
     current_export_settings = load_settings(".blenvy_export_settings")
-    export_settings_changed = not are_settings_identical(previous_export_settings, current_export_settings, white_list=parameter_names_whitelist_auto_export)
+    changed_export_settings_fields = changed_settings(previous_export_settings, current_export_settings, white_list=parameter_names_whitelist_auto_export)
+    export_settings_changed = len(changed_export_settings_fields) > 0
     previous_gltf_settings = load_settings(".blenvy_gltf_settings_previous")
     current_gltf_settings = load_settings(".blenvy_gltf_settings")
-    print("previous_gltf_settings", previous_gltf_settings, "current_gltf_settings", current_gltf_settings)
     gltf_settings_changed = not are_settings_identical(previous_gltf_settings, current_gltf_settings)
-    # write the new settings to the old settings
-    upsert_settings(".blenvy_common_settings_previous", current_common_settings, overwrite=True)
-    upsert_settings(".blenvy_export_settings_previous", current_export_settings, overwrite=True)
-    upsert_settings(".blenvy_gltf_settings_previous", current_gltf_settings, overwrite=True)
-    print("common_settings_changed", common_settings_changed,"export_settings_changed", export_settings_changed, "gltf_settings_changed", gltf_settings_changed, )
+    settings_changed = common_settings_changed or gltf_settings_changed or export_settings_changed
     # if there were no setting before, it is new, we need export # TODO: do we even need this ? I guess in the case where both the previous & the new one are both none ? very unlikely, but still
     if previous_common_settings is None:
-        return True
+        settings_changed = True
     if previous_export_settings is None:
-        return True
+        settings_changed = True
     if previous_gltf_settings is None:
-        return True
+        settings_changed = True
-    return common_settings_changed or gltf_settings_changed or export_settings_changed
+    return settings_changed, current_common_settings, current_export_settings, current_gltf_settings

View File

@@ -41,11 +41,11 @@ def export_main_scene(scene, settings, blueprints_data):
     gltf_output_path = os.path.join(levels_path_full, scene.name)
     inject_blueprints_list_into_main_scene(scene, blueprints_data, settings)
-    print("main scene", scene)
+    """print("main scene", scene)
     for asset in scene.user_assets:
         print("  user asset", asset.name, asset.path)
     for asset in scene.generated_assets:
-        print("  generated asset", asset)
+        print("  generated asset", asset)"""
     """for blueprint in blueprints_data.blueprints_per_scenes[scene.name]:
         print("BLUEPRINT", blueprint)"""
     blueprint_instances_in_scene = blueprints_data.blueprint_instances_per_main_scene.get(scene.name, {}).keys()
@@ -69,7 +69,7 @@ def export_main_scene(scene, settings, blueprints_data):
                 # now also add the assets of the blueprints # TODO: wait no , these should not be a part of the (scene) local assets
                 for asset in blueprint.collection.user_assets:
-                    print("adding assets of blueprint", asset.name)
+                    #print("adding assets of blueprint", asset.name)
                     all_assets.append({"name": asset.name, "path": asset.path})
     """for asset in auto_assets:
@@ -81,8 +81,8 @@ def export_main_scene(scene, settings, blueprints_data):
     materials_exported_path = os.path.join(materials_path, f"{materials_library_name}{export_gltf_extension}")
     material_assets = [{"name": materials_library_name, "path": materials_exported_path}] # we also add the material library as an asset
-    scene["BlenvyAssets"] = assets_to_fake_ron(all_assets + [{"name": asset.name, "path": asset.path} for asset in scene.user_assets] + auto_assets + material_assets)
+    scene["BlueprintAssets"] = assets_to_fake_ron(all_assets + [{"name": asset.name, "path": asset.path} for asset in scene.user_assets] + auto_assets + material_assets)
-    #scene["BlenvyAssets"] = assets_to_fake_ron([{'name':'foo', 'path':'bar'}])
+    #scene["BlueprintAssets"] = assets_to_fake_ron([{'name':'foo', 'path':'bar'}])
     if export_separate_dynamic_and_static_objects:
         #print("SPLIT STATIC AND DYNAMIC")

View File

@@ -9,13 +9,12 @@ settings_black_list = ['settings_save_enabled', 'dry_run']
 def save_settings(settings, context):
     if settings.settings_save_enabled:
         settings_dict = generate_complete_settings_dict(settings, AutoExportSettings, [])
-        print("save settings", settings, context, settings_dict)
         upsert_settings(settings.settings_save_path, {key: settings_dict[key] for key in settings_dict.keys() if key not in settings_black_list}, overwrite=True)
 class AutoExportSettings(PropertyGroup):
     settings_save_path = ".blenvy_export_settings" # where to store data in bpy.texts
-    settings_save_enabled = BoolProperty(name="settings save enabled", default=True)
+    settings_save_enabled: BoolProperty(name="settings save enabled", default=True) # type: ignore
     auto_export: BoolProperty(
         name='Auto export',
@@ -119,7 +118,6 @@ class AutoExportSettings(PropertyGroup):
         self.settings_save_enabled = False # we disable auto_saving of our settings
         try:
             for setting in settings:
-                print("setting", setting, settings[setting])
                 setattr(self, setting, settings[setting])
         except: pass
         # TODO: remove setting if there was a failure

View File

@@ -1,7 +1,7 @@
 import os
 import bpy
 from bpy_types import (PropertyGroup)
-from bpy.props import (EnumProperty, PointerProperty, StringProperty, BoolProperty, CollectionProperty, IntProperty)
+from bpy.props import (EnumProperty, PointerProperty, StringProperty, BoolProperty, CollectionProperty, FloatProperty)
 from blenvy.settings import load_settings, upsert_settings, generate_complete_settings_dict
 from .propGroups.prop_groups import generate_propertyGroups_for_components
 from .components.metadata import ensure_metadata_for_all_items
@@ -18,7 +18,6 @@ def save_settings(settings, context):
 # helper function to deal with timer
 def toggle_watcher(self, context):
-    #print("toggling watcher", self.watcher_enabled, watch_schema, self, bpy.app.timers)
     if not self.watcher_enabled:
         try:
             bpy.app.timers.unregister(watch_schema)
@@ -76,12 +75,12 @@ class ComponentsSettings(PropertyGroup):
     watcher_enabled: BoolProperty(name="Watcher_enabled", default=True, update=toggle_watcher)# type: ignore
     watcher_active: BoolProperty(name = "Flag for watcher status", default = False)# type: ignore
-    watcher_poll_frequency: IntProperty(
+    watcher_poll_frequency: FloatProperty(
         name="watcher poll frequency",
         description="frequency (s) at which to poll for changes to the registry file",
-        min=1,
-        max=10,
-        default=1,
+        min=1.0,
+        max=10.0,
+        default=1.0,
         update=save_settings
     )# type: ignore
@@ -134,7 +133,6 @@ class ComponentsSettings(PropertyGroup):
         self.settings_save_enabled = False # we disable auto_saving of our settings
         try:
             for setting in settings:
-                print("setting", setting, settings[setting])
                 setattr(self, setting, settings[setting])
         except: pass
         try:

View File

@@ -19,14 +19,13 @@ def scan_assets(scene, blueprints_data, settings):
     for blueprint_name in blueprint_instance_names_for_scene:
         blueprint = blueprints_data.blueprints_per_name.get(blueprint_name, None)
         if blueprint is not None:
-            print("BLUEPRINT", blueprint)
+            #print("BLUEPRINT", blueprint)
             blueprint_exported_path = None
             if blueprint.local:
                 blueprint_exported_path = os.path.join(relative_blueprints_path, f"{blueprint.name}{export_gltf_extension}")
             else:
                 # get the injected path of the external blueprints
                 blueprint_exported_path = blueprint.collection['Export_path'] if 'Export_path' in blueprint.collection else None
-                print("foo", dict(blueprint.collection))
             if blueprint_exported_path is not None:
                 blueprint_assets_list.append({"name": blueprint.name, "path": blueprint_exported_path})
@@ -45,7 +44,7 @@ def scan_assets(scene, blueprints_data, settings):
     assets_list_name = f"assets_{scene.name}"
     assets_list_data = {"blueprints": json.dumps(blueprint_assets_list), "sounds":[], "images":[]}
-    print("blueprint assets", blueprint_assets_list)
+    #print("blueprint assets", blueprint_assets_list)
 def get_userTextures():

View File

@@ -42,9 +42,7 @@ def draw_assets(layout, name, title, asset_registry, target_type, target_name, e
     if editable:
         row = panel.row()
         #panel.separator()
-        print("here", user_assets)
         for asset in user_assets:
-            print("asset", asset)
             row = panel.row()
             split = row.split(factor=nesting_indent)
             col = split.column()

View File

@@ -17,13 +17,12 @@ def find_blueprints_not_on_disk(blueprints, folder_path, extension):
 def check_if_blueprint_on_disk(scene_name, folder_path, extension):
     gltf_output_path = os.path.join(folder_path, scene_name + extension)
     found = os.path.exists(gltf_output_path) and os.path.isfile(gltf_output_path)
-    print("level", scene_name, "found", found, "path", gltf_output_path)
     return found
 def inject_export_path_into_internal_blueprints(internal_blueprints, blueprints_path, gltf_extension):
     for blueprint in internal_blueprints:
         blueprint_exported_path = os.path.join(blueprints_path, f"{blueprint.name}{gltf_extension}")
-        print("injecting blueprint path", blueprint_exported_path, "for", blueprint.name)
+        # print("injecting blueprint path", blueprint_exported_path, "for", blueprint.name)
         blueprint.collection["export_path"] = blueprint_exported_path
@@ -44,14 +43,14 @@ def inject_blueprints_list_into_main_scene(scene, blueprints_data, settings):
     for blueprint_name in blueprint_instance_names_for_scene:
         blueprint = blueprints_data.blueprints_per_name.get(blueprint_name, None)
         if blueprint is not None:
-            print("BLUEPRINT", blueprint)
+            #print("BLUEPRINT", blueprint)
             blueprint_exported_path = None
             if blueprint.local:
                 blueprint_exported_path = os.path.join(blueprints_path, f"{blueprint.name}{export_gltf_extension}")
             else:
                 # get the injected path of the external blueprints
                 blueprint_exported_path = blueprint.collection['Export_path'] if 'Export_path' in blueprint.collection else None
-            print("foo", dict(blueprint.collection))
+            #print("foo", dict(blueprint.collection))
             if blueprint_exported_path is not None:
                 blueprint_assets_list.append({"name": blueprint.name, "path": blueprint_exported_path, "type": "MODEL", "internal": True})
@@ -61,9 +60,7 @@ def inject_blueprints_list_into_main_scene(scene, blueprints_data, settings):
     assets_list_name = f"assets_{scene.name}"
     scene["assets"] = json.dumps(blueprint_assets_list)
-    print("blueprint assets", blueprint_assets_list)
-    """add_scene_property(scene, assets_list_name, assets_list_data)
-    """
+    #print("blueprint assets", blueprint_assets_list)
 def remove_blueprints_list_from_main_scene(scene):
     assets_list = None

View File

@@ -7,31 +7,25 @@ import blenvy.add_ons.auto_export.settings as auto_export_settings
 import blenvy.add_ons.bevy_components.settings as component_settings
 # list of settings we do NOT want to save
-settings_black_list = ['settings_save_enabled', 'main_scene_selector', 'main_scenes', 'main_scenes_index', 'library_scene_selector', 'library_scenes', 'library_scenes_index',
-                       #'project_root_path_full', 'assets_path_full', ''
-                       ]
+settings_black_list = ['settings_save_enabled', 'main_scene_selector', 'library_scene_selector']
 def save_settings(settings, context):
     if settings.settings_save_enabled:
         settings_dict = generate_complete_settings_dict(settings, BlenvyManager, [])
-        print("save settings", settings, context, settings_dict)
-        # upsert_settings(settings.settings_save_path, {key: settings_dict[key] for key in settings_dict.keys() if key not in settings_black_list})
+        raw_settings = {key: settings_dict[key] for key in settings_dict.keys() if key not in settings_black_list}
+        # we need to inject the main & library scene names as they are computed properties, not blender ones
+        raw_settings['main_scenes_names'] = settings.main_scenes_names
+        raw_settings['library_scenes_names'] = settings.library_scenes_names
+        upsert_settings(settings.settings_save_path, raw_settings, overwrite=True)
-def update_scene_lists(blenvy, context):
-    blenvy.main_scene_names = [scene.name for scene in blenvy.main_scenes] # FIXME: unsure
-    blenvy.library_scene_names = [scene.name for scene in blenvy.library_scenes] # FIXME: unsure
-    upsert_settings(blenvy.settings_save_path, {"main_scene_names": [scene.name for scene in blenvy.main_scenes]})
-    upsert_settings(blenvy.settings_save_path, {"library_scene_names": [scene.name for scene in blenvy.library_scenes]})
-def update_asset_folders(blenvy, context):
+def update_asset_folders(settings, context):
     asset_path_names = ['project_root_path', 'assets_path', 'blueprints_path', 'levels_path', 'materials_path']
     for asset_path_name in asset_path_names:
-        upsert_settings(blenvy.settings_save_path, {asset_path_name: getattr(blenvy, asset_path_name)})
+        upsert_settings(settings.settings_save_path, {asset_path_name: getattr(settings, asset_path_name)})
+    settings_dict = generate_complete_settings_dict(settings, BlenvyManager, [])
+    upsert_settings(settings.settings_save_path, {key: settings_dict[key] for key in settings_dict.keys() if key not in settings_black_list}, overwrite=True)
-def update_mode(blenvy, context):
-    upsert_settings(blenvy.settings_save_path, {"mode": blenvy.mode })
 def is_scene_already_in_use(self, scene):
     try:
@@ -45,7 +39,7 @@ def is_scene_already_in_use(self, scene):
 class BlenvyManager(PropertyGroup):
     settings_save_path = ".blenvy_common_settings" # where to store data in bpy.texts
-    settings_save_enabled = BoolProperty(name="settings save enabled", default=True)
+    settings_save_enabled: BoolProperty(name="settings save enabled", default=True) # type: ignore
     mode: EnumProperty(
         items=(
@@ -56,14 +50,15 @@ class BlenvyManager(PropertyGroup):
             ('SETTINGS', "Settings", ""),
             ('TOOLS', "Tools", ""),
         ),
-        update=update_mode
+        default="SETTINGS",
+        update=save_settings
     ) # type: ignore
     project_root_path: StringProperty(
         name = "Project Root Path",
         description="The root folder of your (Bevy) project (not assets!)",
         default='../',
-        update= update_asset_folders
+        update= save_settings
     ) # type: ignore
     # computed property for the absolute path of assets
@@ -76,7 +71,7 @@ class BlenvyManager(PropertyGroup):
         description='The root folder for all exports(relative to the root folder/path) Defaults to "assets" ',
         default='./assets',
         options={'HIDDEN'},
-        update= update_asset_folders
+        update= save_settings
     ) # type: ignore
     # computed property for the absolute path of assets
@@ -88,7 +83,7 @@ class BlenvyManager(PropertyGroup):
         name='Blueprints path',
         description='path to export the blueprints to (relative to the assets folder)',
         default='blueprints',
-        update= update_asset_folders
+        update= save_settings
     ) # type: ignore
     # computed property for the absolute path of blueprints
@@ -100,7 +95,7 @@ class BlenvyManager(PropertyGroup):
         name='Levels path',
         description='path to export the levels (main scenes) to (relative to the assets folder)',
         default='levels',
-        update= update_asset_folders
+        update= save_settings
     ) # type: ignore
     # computed property for the absolute path of blueprints
@@ -112,7 +107,7 @@ class BlenvyManager(PropertyGroup):
         name='Materials path',
         description='path to export the materials libraries to (relative to the assets folder)',
         default='materials',
-        update= update_asset_folders
+        update= save_settings
     ) # type: ignore
     # computed property for the absolute path of blueprints
@@ -124,8 +119,8 @@ class BlenvyManager(PropertyGroup):
     auto_export: PointerProperty(type=auto_export_settings.AutoExportSettings) # type: ignore
     components: PointerProperty(type=component_settings.ComponentsSettings) # type: ignore
-    main_scene_selector: PointerProperty(type=bpy.types.Scene, name="main scene", description="main_scene_picker", poll=is_scene_already_in_use)# type: ignore
+    main_scene_selector: PointerProperty(type=bpy.types.Scene, name="main scene", description="main_scene_picker", poll=is_scene_already_in_use, update=save_settings)# type: ignore
-    library_scene_selector: PointerProperty(type=bpy.types.Scene, name="library scene", description="library_scene_picker", poll=is_scene_already_in_use)# type: ignore
+    library_scene_selector: PointerProperty(type=bpy.types.Scene, name="library scene", description="library_scene_picker", poll=is_scene_already_in_use, update=save_settings)# type: ignore
     @property
     def main_scenes(self):
@@ -171,13 +166,21 @@ class BlenvyManager(PropertyGroup):
         print("LOAD SETTINGS")
         settings = load_settings(self.settings_save_path)
         if settings is not None:
-            if "mode" in settings:
+            self.settings_save_enabled = False # we disable auto_saving of our settings
+            try:
+                for setting in settings:
+                    print("setting", setting, settings[setting])
+                    setattr(self, setting, settings[setting])
+            except: pass
+            """if "mode" in settings:
                 self.mode = settings["mode"]
             asset_path_names = ['project_root_path', 'assets_path', 'blueprints_path', 'levels_path', 'materials_path']
             for asset_path_name in asset_path_names:
                 if asset_path_name in settings:
-                    setattr(self, asset_path_name, settings[asset_path_name])
+                    setattr(self, asset_path_name, settings[asset_path_name])"""
+            self.settings_save_enabled = True
             # now load auto_export settings
             self.auto_export.load_settings()

View File

@@ -28,6 +28,8 @@ class BLENVY_OT_scenes_list_actions(Operator):
     def invoke(self, context, event):
         if self.action == 'REMOVE':
             bpy.data.scenes[self.scene_name].blenvy_scene_type = 'None'
+            context.window_manager.blenvy.main_scene_selector = None # we use these to force update/save the list of main/library scenes
+            context.window_manager.blenvy.library_scene_selector = None # we use these to force update/save the list of main/library scenes
             """info = 'Item "%s" removed from list' % (target[idx].name)
             target.remove(idx)
@@ -49,9 +51,9 @@ class BLENVY_OT_scenes_list_actions(Operator):
             print("adding scene", scene_to_add)
             if self.scene_type == "LEVEL":
-                context.window_manager.blenvy.main_scene_selector = None
+                context.window_manager.blenvy.main_scene_selector = None # we use these to force update/save the list of main/library scenes
             else:
-                context.window_manager.blenvy.library_scene_selector = None
+                context.window_manager.blenvy.library_scene_selector = None # we use these to force update/save the list of main/library scenes
             #setattr(source, target_index, len(target) - 1)

View File

@@ -70,7 +70,6 @@ def are_settings_identical(old, new, white_list=None):
     if old is not None and new is None:
         return False
-    #print("TUTU", old_items, new_items)
     old_items = sorted(old.items())
     new_items = sorted(new.items())
@@ -86,3 +85,33 @@ def are_settings_identical(old, new, white_list=None):
         new_items = sorted(new_items_override.items())
     return old_items == new_items
+# if one of the changed settings is not in the white list, it gets discarded
+def changed_settings(old, new, white_list=[]):
+    if old is None and new is None:
+        return []
+    if old is None and new is not None:
+        return new.keys()
+    if old is not None and new is None:
+        return []
+    old_items = sorted(old.items())
+    new_items = sorted(new.items())
+    result = []
+    old_keys = list(old.keys())
+    new_keys = list(new.keys())
+    added = list(set(new_keys) - set(old_keys))
+    removed = list(set(old_keys) - set(new_keys))
+    result += added
+    result += removed
+    for key in new.keys():
+        if key in old:
+            if new[key] != old[key]:
+                result.append(key)
+    return [key for key in list(set(result)) if key in white_list]
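
A quick usage sketch of the new changed_settings helper (values are made up): changed, added, and removed keys are collected, then anything outside the white list is discarded:

old = {"export_scene_settings": True, "export_blueprints": True}
new = {"export_scene_settings": False, "export_blueprints": True, "export_materials_library": True}

changed = changed_settings(old, new, white_list=["export_scene_settings", "export_blueprints"])
print(changed)  # ['export_scene_settings'] ('export_materials_library' is filtered out by the white list)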

View File

@@ -121,9 +121,9 @@ def test_export_complex(setup_data):
     user_asset.path = "audio/fake.mp3"
     # we have to cheat, since we cannot rely on the data injected when saving the library file
-    bpy.data.collections["External_blueprint"]["export_path"] = "blueprints/External_blueprint.glb"
-    bpy.data.collections["External_blueprint2"]["export_path"] = "blueprints/External_blueprint2.glb"
-    bpy.data.collections["External_blueprint3"]["export_path"] = "blueprints/External_blueprint3.glb"
+    #bpy.data.collections["External_blueprint"]["export_path"] = "blueprints/External_blueprint.glb"
+    #bpy.data.collections["External_blueprint2"]["export_path"] = "blueprints/External_blueprint2.glb"
+    #bpy.data.collections["External_blueprint3"]["export_path"] = "blueprints/External_blueprint3.glb"
     prepare_and_export()