Compare commits


No commits in common. "ee14ec69bd88da7f135a3ec3100e128d91c7cbc5" and "6bc0ec078232b20c2d864bd32872d59e3dece7a2" have entirely different histories.

82 changed files with 3757 additions and 356 deletions


@@ -93,7 +93,6 @@ Thanks to all the contributors helping out with this project ! Big kudos to you,
- [yukkop](https://github.com/yukkop)
- [killercup](https://github.com/killercup)
- [janhohenheim ](https://github.com/janhohenheim)
- [BUGO07](https://github.com/BUGO07)
## License

TODO.md

@@ -112,6 +112,7 @@ General issues:
- they normally need/have unique export paths (otherwise, user error, perhaps show it ?)
- perhaps a simple hashing of the parent's path would be enough
- [x] addon-prefs => settings
- [x] generate_gltf_export_settings => should not use add-on prefs at all ? since we are not overriding gltf settings that way anymore ?
- [x] remove hard coded path for standard gltf settings
@@ -120,6 +121,7 @@ General issues:
- [x] components
- [x] add handling of errors when trying to load settings
- [x] fix auto export workflow
- [x] add hashing of modifiers/ geometry nodes in serialize scene
- [x] add ability to FORCE export specific blueprints & levels
@@ -238,10 +240,6 @@ Blender side:
- [ ] blueprint instances as children of blueprint instances
- [ ] blueprint instances as children of empties
- [x] check/ fix behaviour of blender plugin if all folders are the same (ie, all in assets for example)
- [x] rename all "main scene xx" to "level scene"
- [x] make sure the "add scene" button is not available unless you have actually selected one
- [x] make auto export be on by default, however bail out early by detecting if there are any level/blueprint scenes
Bevy Side:
- [x] deprecate BlueprintName & BlueprintPath & use BlueprintInfo instead
@@ -311,9 +309,9 @@ Bevy Side:
- [ ] replace string in BlueprintInfo path with PathBuf ?
- [ ] update main docs
- [x] rename project to Blenvy
- [ ] rename project to Blenvy
- [ ] replace all references to the old 2 add-ons with those to Blenvy
- [x] rename repo to "Blenvy"
- [ ] rename repo to "Blenvy"
- [x] do a deprecation release of all bevy_gltf_xxx crates to point at the new Blenvy crate
- [ ] do a deprecation release of all bevy_gltf_xxx crates to point at the new Blenvy crate
clear && pytest -svv --blender-template ../../testing/bevy_example/art/testing_library.blend --blender-executable /home/ckaos/tools/blender/blender-4.1.0-linux-x64/blender tests/test_bevy_integration_prepare.py && pytest -svv --blender-executable /home/ckaos/tools/blender/blender-4.1.0-linux-x64/blender tests/test_bevy_integration.py


@@ -0,0 +1,21 @@
[package]
name = "bevy_gltf_blueprints"
version = "0.11.0"
authors = ["Mark 'kaosat-dev' Moissette"]
description = "Adds the ability to define Blueprints/Prefabs for Bevy inside gltf files and spawn them in Bevy."
homepage = "https://github.com/kaosat-dev/Blender_bevy_components_workflow"
repository = "https://github.com/kaosat-dev/Blender_bevy_components_workflow"
keywords = ["gamedev", "bevy", "gltf", "blueprint", "prefab"]
categories = ["game-development"]
edition = "2021"
license = "MIT OR Apache-2.0"
[lints]
workspace = true
[dependencies]
bevy_gltf_components = { version = "0.6", path = "../bevy_gltf_components" }
bevy = { version = "0.14", default-features = false, features = ["bevy_asset", "bevy_scene", "bevy_gltf", "bevy_animation", "animation"] }
[dev-dependencies]
bevy = { version = "0.14", default-features = false, features = ["dynamic_linking"] }


@@ -0,0 +1,4 @@
This crate is available under either:
* The [MIT License](./LICENSE_MIT)
* The [Apache License, Version 2.0](./LICENSE_APACHE)


@@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [2023] [Mark "kaosat-dev" Moissette]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2023 Mark "kaosat-dev" Moissette
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.


@@ -0,0 +1,61 @@
use bevy::{math::Vec3A, prelude::*, render::primitives::Aabb};
use crate::{BluePrintsConfig, Spawned};
/// helper system that computes the compound aabbs of the scenes/blueprints
pub fn compute_scene_aabbs(
root_entities: Query<(Entity, &Name), (With<Spawned>, Without<Aabb>)>,
children: Query<&Children>,
existing_aabbs: Query<&Aabb>,
mut blueprints_config: ResMut<BluePrintsConfig>,
mut commands: Commands,
) {
// compute compound aabb
for (root_entity, name) in root_entities.iter() {
// info!("generating aabb for {:?}", name);
// only recompute aabb if it has not already been done before
if blueprints_config.aabb_cache.contains_key(&name.to_string()) {
let aabb = blueprints_config
.aabb_cache
.get(&name.to_string())
.expect("we should have the aabb available");
commands.entity(root_entity).insert(*aabb);
} else {
let aabb = compute_descendant_aabb(root_entity, &children, &existing_aabbs);
commands.entity(root_entity).insert(aabb);
blueprints_config.aabb_cache.insert(name.to_string(), aabb);
}
}
}
pub fn compute_descendant_aabb(
root_entity: Entity,
children: &Query<&Children>,
existing_aabbs: &Query<&Aabb>,
) -> Aabb {
if let Ok(children_list) = children.get(root_entity) {
let mut chilren_aabbs: Vec<Aabb> = vec![];
for child in children_list.iter() {
if let Ok(aabb) = existing_aabbs.get(*child) {
chilren_aabbs.push(*aabb);
} else {
let aabb = compute_descendant_aabb(*child, children, existing_aabbs);
chilren_aabbs.push(aabb);
}
}
let mut min = Vec3A::splat(f32::MAX);
let mut max = Vec3A::splat(f32::MIN);
for aabb in chilren_aabbs.iter() {
min = min.min(aabb.min());
max = max.max(aabb.max());
}
let aabb = Aabb::from_min_max(Vec3::from(min), Vec3::from(max));
return aabb;
}
Aabb::default()
}


@@ -0,0 +1,18 @@
use bevy::prelude::*;
use bevy::utils::HashMap;
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
/// storage for animations for a given entity (hierarchy), essentially a clone of gltf's `named_animations`
pub struct Animations {
pub named_animations: HashMap<String, Handle<AnimationClip>>,
pub named_indices: HashMap<String, AnimationNodeIndex>,
pub graph: Handle<AnimationGraph>,
}
#[derive(Component, Debug)]
/// Stopgap helper component: this is inserted into a "root" entity (an entity representing a whole gltf file)
/// so that the root entity knows which of its children contains an actual `AnimationPlayer` component.
/// This is for convenience, because currently Bevy's gltf parsing inserts `AnimationPlayer`s "one level down"
/// (i.e. armature/root for animated models), which means more complex queries to trigger animations, which we want to avoid
pub struct AnimationPlayerLink(pub Entity);
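
Taken together, `AnimationPlayerLink` points from the blueprint root to the entity that actually holds Bevy's `AnimationPlayer`, while `Animations` exposes the gltf clips and their graph node indices by name. A minimal usage sketch, not part of this diff: the clip name "Walk" and the `Added` trigger are illustrative assumptions, as is attaching the graph handle to the player entity at this point.

use bevy::prelude::*;

// `Animations` and `AnimationPlayerLink` are the components defined above
fn play_walk_animation(
    animated_roots: Query<(&AnimationPlayerLink, &Animations), Added<AnimationPlayerLink>>,
    mut players: Query<&mut AnimationPlayer>,
    mut commands: Commands,
) {
    for (link, animations) in animated_roots.iter() {
        // "Walk" is a hypothetical clip name; real names come from the gltf's named_animations
        if let Some(node_index) = animations.named_indices.get("Walk") {
            if let Ok(mut player) = players.get_mut(link.0) {
                // Bevy 0.14 resolves node indices through an AnimationGraph handle on the
                // player entity; assumption: it has not already been inserted elsewhere
                commands.entity(link.0).insert(animations.graph.clone());
                player.play(*node_index).repeat();
            }
        }
    }
}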


@@ -0,0 +1,60 @@
use std::path::{Path, PathBuf};
use bevy::{asset::LoadedUntypedAsset, gltf::Gltf, prelude::*, utils::HashMap};
use crate::{BluePrintsConfig, BlueprintAnimations};
/// helper component describing a single asset (name & path) used by a blueprint, to enable automatic loading of dependent blueprints
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
pub struct BlueprintAsset {
pub name: String,
pub path: String,
}
/// helper component, used to store the list of sub blueprints to enable automatic loading of dependent blueprints
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
pub struct LocalAssets(pub Vec<BlueprintAsset>);
/// helper component, used to store the list of sub blueprints to enable automatic loading of dependent blueprints
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
pub struct BlueprintAssets(pub Vec<BlueprintAsset>);
////////////////////////
///
/// flag component, usually added when a blueprint is loaded
#[derive(Component)]
pub(crate) struct BlueprintAssetsLoaded;
/// flag component
#[derive(Component)]
pub(crate) struct BlueprintAssetsNotLoaded;
/// helper component, for tracking a loaded asset's loading state, id, handle, etc.
#[derive(Debug)]
pub(crate) struct AssetLoadTracker {
#[allow(dead_code)]
pub name: String,
pub id: AssetId<LoadedUntypedAsset>,
pub loaded: bool,
#[allow(dead_code)]
pub handle: Handle<LoadedUntypedAsset>,
}
/// helper component, for tracking loaded assets
#[derive(Component, Debug)]
pub(crate) struct AssetsToLoad {
pub all_loaded: bool,
pub asset_infos: Vec<AssetLoadTracker>,
pub progress: f32,
}
impl Default for AssetsToLoad {
fn default() -> Self {
Self {
all_loaded: Default::default(),
asset_infos: Default::default(),
progress: Default::default(),
}
}
}
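
For reference, a populated `BlueprintAssets` value (normally produced from gltf extras by the Blender add-on rather than written by hand) could look like the following; the blueprint names and paths are made up for illustration:

BlueprintAssets(vec![
    BlueprintAsset { name: "Enemy".into(), path: "blueprints/Enemy.glb".into() },
    BlueprintAsset { name: "Sword".into(), path: "blueprints/Sword.glb".into() },
])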


@@ -0,0 +1,106 @@
use bevy::{ecs::world::Command, prelude::*};
use std::any::TypeId;
// originally based on https://github.com/bevyengine/bevy/issues/1515,
// more specifically https://gist.github.com/nwtnni/85d6b87ae75337a522166c500c9a8418
// to work with Bevy 0.11
// to copy components between entities but NOT overwriting any existing components
// plus some bells & whistles
pub struct CopyComponents {
pub source: Entity,
pub destination: Entity,
pub exclude: Vec<TypeId>,
pub stringent: bool,
}
impl CopyComponents {
// Copies all components from one entity to another.
// Using an entity with no components as the destination creates a copy of the source entity.
// Panics if:
// - the components are not registered in the type registry,
// - the world does not have a type registry
// - the source or destination entity does not exist
fn transfer_components(self, world: &mut World) {
let components = {
let registry = world
.get_resource::<AppTypeRegistry>()
.expect("the world should have a type registry")
.read();
world
.get_entity(self.source)
.expect("source entity should exist")
.archetype()
.components()
.filter_map(|component_id| {
let component_info = world
.components()
.get_info(component_id)
.expect("component info should be available");
let type_id = component_info.type_id().unwrap();
if self.exclude.contains(&type_id) {
debug!("excluding component: {:?}", component_info.name());
None
} else {
debug!(
"cloning: component: {:?} {:?}",
component_info.name(),
type_id
);
if let Some(type_registration) = registry.get(type_id) {
Some(type_registration)
} else if self.stringent {
return Some(registry.get(type_id).unwrap_or_else(|| {
panic!(
"cannot clone entity: component: {:?} is not registered",
component_info.name()
)
}));
} else {
warn!(
"cannot clone component: component: {:?} is not registered",
component_info.name()
);
None
}
}
})
.map(|type_id| {
return (
type_id.data::<ReflectComponent>().unwrap().clone(),
type_id.type_info().type_id(), // we need the original type_id down the line
);
})
.collect::<Vec<_>>()
};
for (component, type_id) in components {
let type_registry: &AppTypeRegistry = world.resource();
let type_registry = type_registry.clone();
let type_registry = type_registry.read();
let source = component
.reflect(world.get_entity(self.source).unwrap())
.unwrap()
.clone_value();
let mut destination = world
.get_entity_mut(self.destination)
.expect("destination entity should exist");
// println!("contains typeid {:?} {}", type_id, destination.contains_type_id(type_id));
// we only want to copy components that are NOT already in the destination (ie no overwriting existing components)
if !destination.contains_type_id(type_id) {
component.insert(&mut destination, &*source, &type_registry);
}
}
}
}
// This allows the command to be used in systems
impl Command for CopyComponents {
fn apply(self, world: &mut World) {
self.transfer_components(world);
}
}
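
`CopyComponents` is a regular `Command`, so it can be queued from any system that has `Commands`; this is exactly how the spawning post-process later in this diff transfers the blueprint root's components onto the original entity. A usage sketch, with `source` and `destination` as placeholder entities:

// inside a system with `mut commands: Commands`, `source` and `destination` being two live entities
commands.add(CopyComponents {
    source,
    destination,
    // skip hierarchy components so the copy does not re-parent anything
    exclude: vec![TypeId::of::<Parent>(), TypeId::of::<Children>()],
    stringent: false,
});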


@@ -0,0 +1,180 @@
pub mod spawn_from_blueprints;
pub use spawn_from_blueprints::*;
pub mod spawn_post_process;
pub(crate) use spawn_post_process::*;
pub mod animation;
pub use animation::*;
pub mod aabb;
pub use aabb::*;
pub mod materials;
pub use materials::*;
pub mod copy_components;
pub use copy_components::*;
use core::fmt;
use std::path::PathBuf;
use bevy::{
prelude::*,
render::{primitives::Aabb, view::VisibilitySystems},
utils::HashMap,
};
use bevy_gltf_components::{ComponentsFromGltfPlugin, GltfComponentsSet};
#[derive(SystemSet, Debug, Hash, PartialEq, Eq, Clone)]
/// set for the two stages of blueprint-based spawning:
pub enum GltfBlueprintsSet {
Spawn,
AfterSpawn,
}
#[derive(Bundle)]
pub struct BluePrintBundle {
pub blueprint: BlueprintName,
pub spawn_here: SpawnHere,
}
impl Default for BluePrintBundle {
fn default() -> Self {
BluePrintBundle {
blueprint: BlueprintName("default".into()),
spawn_here: SpawnHere,
}
}
}
#[derive(Clone, Resource)]
pub struct BluePrintsConfig {
pub(crate) format: GltfFormat,
pub(crate) library_folder: PathBuf,
pub(crate) aabbs: bool,
pub(crate) aabb_cache: HashMap<String, Aabb>, // cache for aabbs
pub(crate) material_library: bool,
pub(crate) material_library_folder: PathBuf,
pub(crate) material_library_cache: HashMap<String, Handle<StandardMaterial>>,
}
#[derive(Debug, Clone, Copy, Eq, PartialEq, Hash, Default)]
pub enum GltfFormat {
#[default]
GLB,
GLTF,
}
impl fmt::Display for GltfFormat {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
GltfFormat::GLB => {
write!(f, "glb",)
}
GltfFormat::GLTF => {
write!(f, "gltf")
}
}
}
}
#[derive(Debug, Clone)]
/// Plugin for gltf blueprints
pub struct BlueprintsPlugin {
pub legacy_mode: bool, // flag that gets passed on to bevy_gltf_components
pub format: GltfFormat,
/// The base folder where library/blueprints assets are loaded from, relative to the executable.
pub library_folder: PathBuf,
/// Automatically generate aabbs for the blueprints root objects
pub aabbs: bool,
///
pub material_library: bool,
pub material_library_folder: PathBuf,
}
impl Default for BlueprintsPlugin {
fn default() -> Self {
Self {
legacy_mode: true,
format: GltfFormat::GLB,
library_folder: PathBuf::from("models/library"),
aabbs: false,
material_library: false,
material_library_folder: PathBuf::from("materials"),
}
}
}
fn aabbs_enabled(blueprints_config: Res<BluePrintsConfig>) -> bool {
blueprints_config.aabbs
}
fn materials_library_enabled(blueprints_config: Res<BluePrintsConfig>) -> bool {
blueprints_config.material_library
}
impl Plugin for BlueprintsPlugin {
fn build(&self, app: &mut App) {
app.add_plugins(ComponentsFromGltfPlugin {
legacy_mode: self.legacy_mode,
})
.register_type::<BlueprintName>()
.register_type::<MaterialInfo>()
.register_type::<SpawnHere>()
.register_type::<Animations>()
.register_type::<BlueprintsList>()
.register_type::<Vec<String>>()
.register_type::<HashMap<String, Vec<String>>>()
.insert_resource(BluePrintsConfig {
format: self.format,
library_folder: self.library_folder.clone(),
aabbs: self.aabbs,
aabb_cache: HashMap::new(),
material_library: self.material_library,
material_library_folder: self.material_library_folder.clone(),
material_library_cache: HashMap::new(),
})
.configure_sets(
Update,
(GltfBlueprintsSet::Spawn, GltfBlueprintsSet::AfterSpawn)
.chain()
.after(GltfComponentsSet::Injection),
)
.add_systems(
Update,
(
(
prepare_blueprints,
check_for_loaded,
spawn_from_blueprints,
apply_deferred,
)
.chain(),
(compute_scene_aabbs, apply_deferred)
.chain()
.run_if(aabbs_enabled),
apply_deferred,
(
materials_inject,
check_for_material_loaded,
materials_inject2,
)
.chain()
.run_if(materials_library_enabled),
)
.chain()
.in_set(GltfBlueprintsSet::Spawn),
)
.add_systems(
PostUpdate,
(spawned_blueprint_post_process, apply_deferred)
.chain()
.in_set(GltfBlueprintsSet::AfterSpawn)
.before(VisibilitySystems::CheckVisibility),
);
}
}
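
For context, wiring this plugin into an app might look like the sketch below; the folder name and flags are illustrative, and anything omitted falls back to the `Default` impl shown above:

use bevy::prelude::*;
use bevy_gltf_blueprints::*;

fn main() {
    App::new()
        .add_plugins(DefaultPlugins)
        .add_plugins(BlueprintsPlugin {
            legacy_mode: false,
            format: GltfFormat::GLB,
            // where blueprint gltf files live, relative to the assets folder
            library_folder: "models/library".into(),
            aabbs: true,
            ..Default::default()
        })
        .run();
}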


@@ -0,0 +1,201 @@
use std::path::Path;
use bevy::{
asset::{AssetServer, Assets, Handle},
ecs::{
component::Component,
entity::Entity,
query::{Added, With},
reflect::ReflectComponent,
system::{Commands, Query, Res, ResMut},
},
gltf::Gltf,
hierarchy::{Children, Parent},
log::debug,
pbr::StandardMaterial,
reflect::Reflect,
render::mesh::Mesh,
};
use crate::{AssetLoadTracker, AssetsToLoad, BluePrintsConfig};
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
/// struct containing the name & source of the material to apply
pub struct MaterialInfo {
pub name: String,
pub source: String,
}
/// flag component
#[derive(Component)]
pub(crate) struct BlueprintMaterialAssetsLoaded;
/// flag component
#[derive(Component)]
pub(crate) struct BlueprintMaterialAssetsNotLoaded;
/// system that injects / replaces materials from material library
pub(crate) fn materials_inject(
blueprints_config: ResMut<BluePrintsConfig>,
material_infos: Query<(Entity, &MaterialInfo), Added<MaterialInfo>>,
asset_server: Res<AssetServer>,
mut commands: Commands,
) {
for (entity, material_info) in material_infos.iter() {
let model_file_name = format!(
"{}_materials_library.{}",
&material_info.source, &blueprints_config.format
);
let materials_path = Path::new(&blueprints_config.material_library_folder)
.join(Path::new(model_file_name.as_str()));
let material_name = &material_info.name;
let material_full_path = materials_path.to_str().unwrap().to_string() + "#" + material_name; // TODO: yikes, cleanup
if blueprints_config
.material_library_cache
.contains_key(&material_full_path)
{
debug!("material is cached, retrieving");
blueprints_config
.material_library_cache
.get(&material_full_path)
.expect("we should have the material available");
commands
.entity(entity)
.insert(BlueprintMaterialAssetsLoaded);
} else {
let material_file_handle: Handle<Gltf> = asset_server.load(materials_path.clone());
let material_file_id = material_file_handle.id();
let asset_infos: Vec<AssetLoadTracker<Gltf>> = vec![AssetLoadTracker {
name: material_full_path,
id: material_file_id,
loaded: false,
handle: material_file_handle.clone(),
}];
commands
.entity(entity)
.insert(AssetsToLoad {
all_loaded: false,
asset_infos,
..Default::default()
})
.insert(BlueprintMaterialAssetsNotLoaded);
/**/
}
}
}
// TODO, merge with check_for_loaded, make generic ?
pub(crate) fn check_for_material_loaded(
mut blueprint_assets_to_load: Query<
(Entity, &mut AssetsToLoad<Gltf>),
With<BlueprintMaterialAssetsNotLoaded>,
>,
asset_server: Res<AssetServer>,
mut commands: Commands,
) {
for (entity, mut assets_to_load) in blueprint_assets_to_load.iter_mut() {
let mut all_loaded = true;
let mut loaded_amount = 0;
let total = assets_to_load.asset_infos.len();
for tracker in assets_to_load.asset_infos.iter_mut() {
let asset_id = tracker.id;
let loaded = asset_server.is_loaded_with_dependencies(asset_id);
tracker.loaded = loaded;
if loaded {
loaded_amount += 1;
} else {
all_loaded = false;
}
}
let progress: f32 = loaded_amount as f32 / total as f32;
assets_to_load.progress = progress;
if all_loaded {
assets_to_load.all_loaded = true;
commands
.entity(entity)
.insert(BlueprintMaterialAssetsLoaded)
.remove::<BlueprintMaterialAssetsNotLoaded>();
}
}
}
/// system that injects / replaces materials from material library
pub(crate) fn materials_inject2(
mut blueprints_config: ResMut<BluePrintsConfig>,
material_infos: Query<
(&MaterialInfo, &Children),
(
Added<BlueprintMaterialAssetsLoaded>,
With<BlueprintMaterialAssetsLoaded>,
),
>,
with_materials_and_meshes: Query<
(),
(
With<Parent>,
With<Handle<StandardMaterial>>,
With<Handle<Mesh>>,
),
>,
assets_gltf: Res<Assets<Gltf>>,
asset_server: Res<AssetServer>,
mut commands: Commands,
) {
for (material_info, children) in material_infos.iter() {
let model_file_name = format!(
"{}_materials_library.{}",
&material_info.source, &blueprints_config.format
);
let materials_path = Path::new(&blueprints_config.material_library_folder)
.join(Path::new(model_file_name.as_str()));
let material_name = &material_info.name;
let material_full_path = materials_path.to_str().unwrap().to_string() + "#" + material_name; // TODO: yikes, cleanup
let mut material_found: Option<&Handle<StandardMaterial>> = None;
if blueprints_config
.material_library_cache
.contains_key(&material_full_path)
{
debug!("material is cached, retrieving");
let material = blueprints_config
.material_library_cache
.get(&material_full_path)
.expect("we should have the material available");
material_found = Some(material);
} else {
let model_handle: Handle<Gltf> = asset_server.load(materials_path.clone()); // FIXME: kinda weird now
let mat_gltf = assets_gltf
.get(model_handle.id())
.expect("material should have been preloaded");
if mat_gltf.named_materials.contains_key(material_name as &str) {
let material = mat_gltf
.named_materials
.get(material_name as &str)
.expect("this material should have been loaded");
blueprints_config
.material_library_cache
.insert(material_full_path, material.clone());
material_found = Some(material);
}
}
if let Some(material) = material_found {
for child in children.iter() {
if with_materials_and_meshes.contains(*child) {
debug!(
"injecting material {}, path: {:?}",
material_name,
materials_path.clone()
);
commands.entity(*child).insert(material.clone());
}
}
}
}
}
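
To make the path construction above concrete: with the default `material_library_folder` of "materials" and `GltfFormat::GLB`, a component like the hypothetical one below resolves to the file "materials/my_project_materials_library.glb", and the material named "Metal" is then looked up in that gltf's `named_materials`:

MaterialInfo {
    name: "Metal".into(),        // material name inside the library file (made up)
    source: "my_project".into(), // prefix of the exported materials library file (made up)
}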


@@ -0,0 +1,248 @@
/// helper component, for tracking a loaded asset's loading state, id, handle, etc.
#[derive(Default, Debug)]
pub(crate) struct AssetLoadTracker<T: bevy::prelude::Asset> {
#[allow(dead_code)]
pub name: String,
pub id: AssetId<T>,
pub loaded: bool,
#[allow(dead_code)]
pub handle: Handle<T>,
}
/// helper component, for tracking loaded assets
#[derive(Component, Debug)]
pub(crate) struct AssetsToLoad<T: bevy::prelude::Asset> {
pub all_loaded: bool,
pub asset_infos: Vec<AssetLoadTracker<T>>,
pub progress: f32,
}
impl<T: bevy::prelude::Asset> Default for AssetsToLoad<T> {
fn default() -> Self {
Self {
all_loaded: Default::default(),
asset_infos: Default::default(),
progress: Default::default(),
}
}
}
/// flag component, usually added when a blueprint is loaded
#[derive(Component)]
pub(crate) struct BlueprintAssetsLoaded;
/// flag component
#[derive(Component)]
pub(crate) struct BlueprintAssetsNotLoaded;
/// spawning prepare function,
/// * also takes into account the already existing "override" components, i.e. "override components" > components from blueprint
pub(crate) fn prepare_blueprints(
spawn_placeholders: Query<
(
Entity,
&BlueprintName,
Option<&Parent>,
Option<&Library>,
Option<&Name>,
Option<&BlueprintsList>,
),
(Added<BlueprintName>, Added<SpawnHere>, Without<Spawned>),
>,
mut commands: Commands,
asset_server: Res<AssetServer>,
blueprints_config: Res<BluePrintsConfig>,
) {
for (entity, blupeprint_name, original_parent, library_override, name, blueprints_list) in
spawn_placeholders.iter()
{
debug!(
"requesting to spawn {:?} for entity {:?}, id: {:?}, parent:{:?}",
blupeprint_name.0, name, entity, original_parent
);
// println!("main model path {:?}", model_path);
if blueprints_list.is_some() {
let blueprints_list = blueprints_list.unwrap();
// println!("blueprints list {:?}", blueprints_list.0.keys());
let mut asset_infos: Vec<AssetLoadTracker<Gltf>> = vec![];
let library_path =
library_override.map_or_else(|| &blueprints_config.library_folder, |l| &l.0);
for (blueprint_name, _) in blueprints_list.0.iter() {
let model_file_name = format!("{}.{}", &blueprint_name, &blueprints_config.format);
let model_path = Path::new(&library_path).join(Path::new(model_file_name.as_str()));
let model_handle: Handle<Gltf> = asset_server.load(model_path.clone());
let model_id = model_handle.id();
let loaded = asset_server.is_loaded_with_dependencies(model_id);
if !loaded {
asset_infos.push(AssetLoadTracker {
name: model_path.to_string_lossy().into(),
id: model_id,
loaded: false,
handle: model_handle.clone(),
});
}
}
// if not all assets are already loaded, inject a component to signal that we need them to be loaded
if !asset_infos.is_empty() {
commands
.entity(entity)
.insert(AssetsToLoad {
all_loaded: false,
asset_infos,
..Default::default()
})
.insert(BlueprintAssetsNotLoaded);
} else {
commands.entity(entity).insert(BlueprintAssetsLoaded);
}
} else {
// in case there are no blueprintsList, we revert back to the old behaviour
commands.entity(entity).insert(BlueprintAssetsLoaded);
}
}
}
pub(crate) fn check_for_loaded(
mut blueprint_assets_to_load: Query<
(Entity, &mut AssetsToLoad<Gltf>),
With<BlueprintAssetsNotLoaded>,
>,
asset_server: Res<AssetServer>,
mut commands: Commands,
) {
for (entity, mut assets_to_load) in blueprint_assets_to_load.iter_mut() {
let mut all_loaded = true;
let mut loaded_amount = 0;
let total = assets_to_load.asset_infos.len();
for tracker in assets_to_load.asset_infos.iter_mut() {
let asset_id = tracker.id;
let loaded = asset_server.is_loaded_with_dependencies(asset_id);
tracker.loaded = loaded;
if loaded {
loaded_amount += 1;
} else {
all_loaded = false;
}
}
let progress: f32 = loaded_amount as f32 / total as f32;
// println!("progress: {}",progress);
assets_to_load.progress = progress;
if all_loaded {
assets_to_load.all_loaded = true;
commands
.entity(entity)
.insert(BlueprintAssetsLoaded)
.remove::<BlueprintAssetsNotLoaded>();
}
}
}
pub(crate) fn spawn_from_blueprints(
spawn_placeholders: Query<
(
Entity,
&BlueprintName,
Option<&Transform>,
Option<&Parent>,
Option<&Library>,
Option<&AddToGameWorld>,
Option<&Name>,
),
(
With<BlueprintAssetsLoaded>,
Added<BlueprintAssetsLoaded>,
Without<BlueprintAssetsNotLoaded>,
),
>,
mut commands: Commands,
mut game_world: Query<Entity, With<GameWorldTag>>,
assets_gltf: Res<Assets<Gltf>>,
asset_server: Res<AssetServer>,
blueprints_config: Res<BluePrintsConfig>,
children: Query<&Children>,
) {
for (
entity,
blupeprint_name,
transform,
original_parent,
library_override,
add_to_world,
name,
) in spawn_placeholders.iter()
{
debug!(
"attempting to spawn {:?} for entity {:?}, id: {:?}, parent:{:?}",
blupeprint_name.0, name, entity, original_parent
);
let what = &blupeprint_name.0;
let model_file_name = format!("{}.{}", &what, &blueprints_config.format);
// library path is either defined at the plugin level or overridden by optional Library components
let library_path =
library_override.map_or_else(|| &blueprints_config.library_folder, |l| &l.0);
let model_path = Path::new(&library_path).join(Path::new(model_file_name.as_str()));
// info!("attempting to spawn {:?}", model_path);
let model_handle: Handle<Gltf> = asset_server.load(model_path.clone()); // FIXME: kinda weird now
let gltf = assets_gltf.get(&model_handle).unwrap_or_else(|| {
panic!(
"gltf file {:?} should have been loaded",
model_path.to_str()
)
});
// WARNING we work under the assumption that there is ONLY ONE named scene, and that the first one is the right one
let main_scene_name = gltf
.named_scenes
.keys()
.next()
.expect("there should be at least one named scene in the gltf file to spawn");
let scene = &gltf.named_scenes[main_scene_name];
// transforms are optional, but still deal with them correctly
let mut transforms: Transform = Transform::default();
if transform.is_some() {
transforms = *transform.unwrap();
}
let mut original_children: Vec<Entity> = vec![];
if let Ok(c) = children.get(entity) {
for child in c.iter() {
original_children.push(*child);
}
}
commands.entity(entity).insert((
SceneBundle {
scene: scene.clone(),
transform: transforms,
..Default::default()
},
Spawned,
OriginalChildren(original_children),
BlueprintAnimations {
// these are animations specific to the inside of the blueprint
named_animations: gltf.named_animations.clone(),
},
));
if add_to_world.is_some() {
let world = game_world
.get_single_mut()
.expect("there should be a game world present");
commands.entity(world).add_child(entity);
}
}
}


@@ -0,0 +1,311 @@
use std::path::{Path, PathBuf};
use bevy::{gltf::Gltf, prelude::*, utils::HashMap};
use crate::{Animations, BluePrintsConfig};
/// this is a flag component for our levels/game world
#[derive(Component)]
pub struct GameWorldTag;
/// Main component for the blueprints
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
pub struct BlueprintName(pub String);
/// flag component needed to signify the intent to spawn a Blueprint
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
pub struct SpawnHere;
#[derive(Component)]
/// flag component for dynamically spawned scenes
pub struct Spawned;
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
/// flag component marking any spawned child of blueprints, unless the original entity was marked with the `NoInBlueprint` marker component
pub struct InBlueprint;
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
/// flag component preventing any spawned child of blueprints from being marked with the `InBlueprint` component
pub struct NoInBlueprint;
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
// this allows overriding the default library path for a given entity/blueprint
pub struct Library(pub PathBuf);
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
/// flag component to force adding newly spawned entity as child of game world
pub struct AddToGameWorld;
#[derive(Component)]
/// helper component, just to transfer child data
pub(crate) struct OriginalChildren(pub Vec<Entity>);
/// helper component, used to store the list of sub blueprints to enable automatic loading of dependent blueprints
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
pub struct BlueprintsList(pub HashMap<String, Vec<String>>);
/// helper component, for tracking a loaded asset's loading state, id, handle, etc.
#[derive(Default, Debug)]
pub(crate) struct AssetLoadTracker<T: bevy::prelude::Asset> {
#[allow(dead_code)]
pub name: String,
pub id: AssetId<T>,
pub loaded: bool,
#[allow(dead_code)]
pub handle: Handle<T>,
}
/// helper component, for tracking loaded assets
#[derive(Component, Debug)]
pub(crate) struct AssetsToLoad<T: bevy::prelude::Asset> {
pub all_loaded: bool,
pub asset_infos: Vec<AssetLoadTracker<T>>,
pub progress: f32,
}
impl<T: bevy::prelude::Asset> Default for AssetsToLoad<T> {
fn default() -> Self {
Self {
all_loaded: Default::default(),
asset_infos: Default::default(),
progress: Default::default(),
}
}
}
/// flag component, usually added when a blueprint is loaded
#[derive(Component)]
pub(crate) struct BlueprintAssetsLoaded;
/// flag component
#[derive(Component)]
pub(crate) struct BlueprintAssetsNotLoaded;
/// spawning prepare function,
/// * also takes into account the already existing "override" components, i.e. "override components" > components from blueprint
pub(crate) fn prepare_blueprints(
spawn_placeholders: Query<
(
Entity,
&BlueprintName,
Option<&Parent>,
Option<&Library>,
Option<&Name>,
Option<&BlueprintsList>,
),
(Added<BlueprintName>, Added<SpawnHere>, Without<Spawned>),
>,
mut commands: Commands,
asset_server: Res<AssetServer>,
blueprints_config: Res<BluePrintsConfig>,
) {
for (entity, blupeprint_name, original_parent, library_override, name, blueprints_list) in
spawn_placeholders.iter()
{
debug!(
"requesting to spawn {:?} for entity {:?}, id: {:?}, parent:{:?}",
blupeprint_name.0, name, entity, original_parent
);
// println!("main model path {:?}", model_path);
if blueprints_list.is_some() {
let blueprints_list = blueprints_list.unwrap();
// println!("blueprints list {:?}", blueprints_list.0.keys());
let mut asset_infos: Vec<AssetLoadTracker<Gltf>> = vec![];
let library_path =
library_override.map_or_else(|| &blueprints_config.library_folder, |l| &l.0);
for (blueprint_name, _) in blueprints_list.0.iter() {
let model_file_name = format!("{}.{}", &blueprint_name, &blueprints_config.format);
let model_path = Path::new(&library_path).join(Path::new(model_file_name.as_str()));
let model_handle: Handle<Gltf> = asset_server.load(model_path.clone());
let model_id = model_handle.id();
let loaded = asset_server.is_loaded_with_dependencies(model_id);
if !loaded {
asset_infos.push(AssetLoadTracker {
name: model_path.to_string_lossy().into(),
id: model_id,
loaded: false,
handle: model_handle.clone(),
});
}
}
// if not all assets are already loaded, inject a component to signal that we need them to be loaded
if !asset_infos.is_empty() {
commands
.entity(entity)
.insert(AssetsToLoad {
all_loaded: false,
asset_infos,
..Default::default()
})
.insert(BlueprintAssetsNotLoaded);
} else {
commands.entity(entity).insert(BlueprintAssetsLoaded);
}
} else {
// in case there are no blueprintsList, we revert back to the old behaviour
commands.entity(entity).insert(BlueprintAssetsLoaded);
}
}
}
pub(crate) fn check_for_loaded(
mut blueprint_assets_to_load: Query<
(Entity, &mut AssetsToLoad<Gltf>),
With<BlueprintAssetsNotLoaded>,
>,
asset_server: Res<AssetServer>,
mut commands: Commands,
) {
for (entity, mut assets_to_load) in blueprint_assets_to_load.iter_mut() {
let mut all_loaded = true;
let mut loaded_amount = 0;
let total = assets_to_load.asset_infos.len();
for tracker in assets_to_load.asset_infos.iter_mut() {
let asset_id = tracker.id;
let loaded = asset_server.is_loaded_with_dependencies(asset_id);
tracker.loaded = loaded;
if loaded {
loaded_amount += 1;
} else {
all_loaded = false;
}
}
let progress: f32 = loaded_amount as f32 / total as f32;
// println!("progress: {}",progress);
assets_to_load.progress = progress;
if all_loaded {
assets_to_load.all_loaded = true;
commands
.entity(entity)
.insert(BlueprintAssetsLoaded)
.remove::<BlueprintAssetsNotLoaded>();
}
}
}
pub(crate) fn spawn_from_blueprints(
spawn_placeholders: Query<
(
Entity,
&BlueprintName,
Option<&Transform>,
Option<&Parent>,
Option<&Library>,
Option<&AddToGameWorld>,
Option<&Name>,
),
(
With<BlueprintAssetsLoaded>,
Added<BlueprintAssetsLoaded>,
Without<BlueprintAssetsNotLoaded>,
),
>,
mut commands: Commands,
mut game_world: Query<Entity, With<GameWorldTag>>,
assets_gltf: Res<Assets<Gltf>>,
mut graphs: ResMut<Assets<AnimationGraph>>,
asset_server: Res<AssetServer>,
blueprints_config: Res<BluePrintsConfig>,
children: Query<&Children>,
) {
for (
entity,
blupeprint_name,
transform,
original_parent,
library_override,
add_to_world,
name,
) in spawn_placeholders.iter()
{
debug!(
"attempting to spawn {:?} for entity {:?}, id: {:?}, parent:{:?}",
blupeprint_name.0, name, entity, original_parent
);
let what = &blupeprint_name.0;
let model_file_name = format!("{}.{}", &what, &blueprints_config.format);
// library path is either defined at the plugin level or overridden by optional Library components
let library_path =
library_override.map_or_else(|| &blueprints_config.library_folder, |l| &l.0);
let model_path = Path::new(&library_path).join(Path::new(model_file_name.as_str()));
// info!("attempting to spawn {:?}", model_path);
let model_handle: Handle<Gltf> = asset_server.load(model_path.clone()); // FIXME: kinda weird now
let gltf = assets_gltf.get(&model_handle).unwrap_or_else(|| {
panic!(
"gltf file {:?} should have been loaded",
model_path.to_str()
)
});
// WARNING we work under the assumption that there is ONLY ONE named scene, and that the first one is the right one
let main_scene_name = gltf
.named_scenes
.keys()
.next()
.expect("there should be at least one named scene in the gltf file to spawn");
let scene = &gltf.named_scenes[main_scene_name];
// transforms are optional, but still deal with them correctly
let mut transforms: Transform = Transform::default();
if transform.is_some() {
transforms = *transform.unwrap();
}
let mut original_children: Vec<Entity> = vec![];
if let Ok(c) = children.get(entity) {
for child in c.iter() {
original_children.push(*child);
}
}
let mut graph = AnimationGraph::new();
let mut named_animations: HashMap<String, Handle<AnimationClip>> = HashMap::new();
let mut named_indices: HashMap<String, AnimationNodeIndex> = HashMap::new();
for (key, clip) in gltf.named_animations.iter() {
named_animations.insert(key.to_string(), clip.clone());
let animation_index = graph.add_clip(clip.clone(), 1.0, graph.root);
named_indices.insert(key.to_string(), animation_index);
}
let graph = graphs.add(graph);
commands.entity(entity).insert((
SceneBundle {
scene: scene.clone(),
transform: transforms,
..Default::default()
},
Animations {
named_animations,
named_indices,
graph
},
Spawned,
OriginalChildren(original_children),
));
if add_to_world.is_some() {
let world = game_world
.get_single_mut()
.expect("there should be a game world present");
commands.entity(world).add_child(entity);
}
}
}
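
On the game side, requesting one of these blueprints comes down to spawning an entity with `BlueprintName` + `SpawnHere` (or the `BluePrintBundle` from lib.rs) and letting the systems above handle loading and scene swapping. A minimal sketch: the blueprint name is hypothetical, and `AddToGameWorld` assumes an entity tagged with `GameWorldTag` already exists.

// assumes `use bevy::prelude::*;` and this crate's types are in scope
fn spawn_a_blueprint(mut commands: Commands) {
    commands.spawn((
        BluePrintBundle {
            // hypothetical blueprint name, i.e. the gltf file "HealthPickup.glb" in the library folder
            blueprint: BlueprintName("HealthPickup".to_string()),
            ..Default::default()
        },
        // optional extras picked up by prepare_blueprints / spawn_from_blueprints above
        TransformBundle::from_transform(Transform::from_xyz(0.0, 1.0, 0.0)),
        AddToGameWorld,
    ));
}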


@@ -0,0 +1,100 @@
use std::any::TypeId;
use bevy::gltf::Gltf;
use bevy::prelude::*;
use bevy::scene::SceneInstance;
// use bevy::utils::hashbrown::HashSet;
use super::{AnimationPlayerLink, Animations};
use super::{SpawnHere, Spawned};
use crate::{
AssetsToLoad, BlueprintAssetsLoaded, CopyComponents, InBlueprint, NoInBlueprint,
OriginalChildren,
};
/// this system is in charge of doing any necessary post processing after a blueprint scene has been spawned
/// - it removes one level of useless nesting
/// - it copies the blueprint's root components to the entity it was spawned on (original entity)
/// - it copies the children of the blueprint scene into the original entity
/// - it adds `AnimationPlayerLink` components so that animations can be controlled from the original entity
/// - it cleans up / removes a few by-then unneeded components
pub(crate) fn spawned_blueprint_post_process(
unprocessed_entities: Query<
(
Entity,
&Children,
&OriginalChildren,
&Animations,
Option<&NoInBlueprint>,
Option<&Name>,
),
(With<SpawnHere>, With<SceneInstance>, With<Spawned>),
>,
added_animation_players: Query<(Entity, &Parent), Added<AnimationPlayer>>,
all_children: Query<&Children>,
mut commands: Commands,
) {
for (original, children, original_children, animations, no_inblueprint, name) in
unprocessed_entities.iter()
{
debug!("post processing blueprint for entity {:?}", name);
if children.len() == 0 {
warn!("timing issue ! no children found, please restart your bevy app (bug being investigated)");
continue;
}
// the root node is the first & normally only child inside a scene, it is the one that has all relevant components
let mut root_entity = Entity::PLACEHOLDER; //FIXME: and what about childless ones ?? => should not be possible normally
// let diff = HashSet::from_iter(original_children.0).difference(HashSet::from_iter(children));
// we find the first child that was not in the entity before (aka added during the scene spawning)
for c in children.iter() {
if !original_children.0.contains(c) {
root_entity = *c;
break;
}
}
// we flag all children of the blueprint instance with 'InBlueprint'
// can be useful to filter out anything that came from blueprints vs normal children
if no_inblueprint.is_none() {
for child in all_children.iter_descendants(root_entity) {
commands.entity(child).insert(InBlueprint);
}
}
// copy components from the blueprint instance's root_entity to the original entity
commands.add(CopyComponents {
source: root_entity,
destination: original,
exclude: vec![TypeId::of::<Parent>(), TypeId::of::<Children>()],
stringent: false,
});
// we move all of the children of the blueprint instance up one level, to the original entity
if let Ok(root_entity_children) = all_children.get(root_entity) {
for child in root_entity_children.iter() {
// info!("copying child {:?} upward from {:?} to {:?}", names.get(*child), root_entity, original);
commands.entity(original).add_child(*child);
}
}
if animations.named_animations.keys().len() > 0 {
for (added, parent) in added_animation_players.iter() {
if parent.get() == root_entity {
// FIXME: stopgap solution: since we cannot use an AnimationPlayer at the root entity level
// and we cannot update animation clips so that the EntityPaths point to one level deeper,
// BUT we still want to have some marker/control at the root entity level, we add this
commands.entity(original).insert(AnimationPlayerLink(added));
}
}
}
commands.entity(original).remove::<SpawnHere>();
commands.entity(original).remove::<Spawned>();
commands.entity(original).remove::<Handle<Scene>>();
commands.entity(original).remove::<AssetsToLoad<Gltf>>(); // also clear the sub assets tracker to free up handles, perhaps just freeing up the handles and leave the rest would be better ?
commands.entity(original).remove::<BlueprintAssetsLoaded>();
commands.entity(root_entity).despawn_recursive();
}
}
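
Since this post-process removes `SpawnHere` as one of its last steps, game code can treat that removal as a cheap "blueprint instance is ready" signal. A hedged sketch, assuming no other system removes `SpawnHere`:

// assumes `use bevy::prelude::*;` and this crate's `SpawnHere` are in scope
fn on_blueprint_ready(mut removed: RemovedComponents<SpawnHere>, names: Query<&Name>) {
    for entity in removed.read() {
        // the entity has gone through spawned_blueprint_post_process; react here
        info!("blueprint instance ready: {:?}", names.get(entity).ok());
    }
}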


@@ -0,0 +1,22 @@
[package]
name = "bevy_gltf_components"
version = "0.6.0"
authors = ["Mark 'kaosat-dev' Moissette"]
description = "Allows you to define Bevy components direclty inside gltf files and instanciate the components on the Bevy side."
homepage = "https://github.com/kaosat-dev/Blender_bevy_components_workflow"
repository = "https://github.com/kaosat-dev/Blender_bevy_components_workflow"
keywords = ["gamedev", "bevy", "assets", "gltf", "components"]
categories = ["game-development"]
edition = "2021"
license = "MIT OR Apache-2.0"
[lints]
workspace = true
[dependencies]
bevy = { version = "0.14", default-features = false, features = ["bevy_asset", "bevy_scene", "bevy_gltf"] }
serde = "1.0.188"
ron = "0.8.1"
[dev-dependencies]
bevy = { version = "0.14", default-features = false, features = ["dynamic_linking"] }


@@ -0,0 +1,4 @@
This crate is available under either:
* The [MIT License](./LICENSE_MIT)
* The [Apache License, Version 2.0](./LICENSE_APACHE)


@@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [2023] [Mark "kaosat-dev" Moissette]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

View File

@ -0,0 +1,21 @@
MIT License
Copyright (c) 2023 Mark "kaosat-dev" Moissette
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View File

@ -0,0 +1,8 @@
use bevy::prelude::*;
mod lighting;
pub use lighting::*;
pub(crate) fn plugin(app: &mut App) {
app.add_plugins(lighting::plugin);
}

View File

@ -0,0 +1,97 @@
use bevy::pbr::DirectionalLightShadowMap;
use bevy::prelude::*;
use crate::GltfComponentsSet;
pub(crate) fn plugin(app: &mut App) {
app.register_type::<BlenderBackgroundShader>()
.register_type::<BlenderShadowSettings>()
.register_type::<BlenderLightShadows>()
.add_systems(
Update,
(process_lights, process_shadowmap, process_background_shader)
.after(GltfComponentsSet::Injection),
);
}
#[derive(Component, Reflect, Default, Debug, PartialEq, Clone)]
#[reflect(Component)]
#[non_exhaustive]
/// The properties of a light's shadow, enabling per-light shadow control from Blender
pub struct BlenderLightShadows {
pub enabled: bool,
pub buffer_bias: f32,
}
/// The background color as described by Blender's [background shader](https://docs.blender.org/manual/en/latest/render/shader_nodes/shader/background.html).
#[derive(Component, Reflect, Default, Debug, PartialEq, Clone)]
#[reflect(Component)]
#[non_exhaustive]
pub struct BlenderBackgroundShader {
pub color: Color,
pub strength: f32,
}
/// The settings used by EEVEE's [shadow rendering](https://docs.blender.org/manual/en/latest/render/eevee/render_settings/shadows.html).
#[derive(Component, Reflect, Default, Debug, PartialEq, Clone)]
#[reflect(Component)]
#[non_exhaustive]
pub struct BlenderShadowSettings {
pub cascade_size: usize,
}
fn process_lights(
mut directional_lights: Query<
(&mut DirectionalLight, Option<&BlenderLightShadows>),
Added<DirectionalLight>,
>,
mut spot_lights: Query<(&mut SpotLight, Option<&BlenderLightShadows>), Added<SpotLight>>,
mut point_lights: Query<(&mut PointLight, Option<&BlenderLightShadows>), Added<PointLight>>,
) {
for (mut light, blender_light_shadows) in directional_lights.iter_mut() {
if let Some(blender_light_shadows) = blender_light_shadows {
light.shadows_enabled = blender_light_shadows.enabled;
} else {
light.shadows_enabled = true;
}
}
for (mut light, blender_light_shadows) in spot_lights.iter_mut() {
if let Some(blender_light_shadows) = blender_light_shadows {
light.shadows_enabled = blender_light_shadows.enabled;
} else {
light.shadows_enabled = true;
}
}
for (mut light, blender_light_shadows) in point_lights.iter_mut() {
if let Some(blender_light_shadows) = blender_light_shadows {
light.shadows_enabled = blender_light_shadows.enabled;
} else {
light.shadows_enabled = true;
}
}
}
fn process_shadowmap(
shadowmaps: Query<&BlenderShadowSettings, Added<BlenderShadowSettings>>,
mut commands: Commands,
) {
for shadowmap in shadowmaps.iter() {
commands.insert_resource(DirectionalLightShadowMap {
size: shadowmap.cascade_size,
});
}
}
fn process_background_shader(
background_shaders: Query<&BlenderBackgroundShader, Added<BlenderBackgroundShader>>,
mut commands: Commands,
) {
for background_shader in background_shaders.iter() {
commands.insert_resource(AmbientLight {
color: background_shader.color,
// Just a guess, see <https://github.com/bevyengine/bevy/issues/12280>
brightness: background_shader.strength * 400.0,
});
}
}
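// A minimal usage sketch (hypothetical same-crate code, not taken from the diff above):
// a light spawned with a `BlenderLightShadows` component gets its `shadows_enabled`
// flag copied over by `process_lights` once the light is detected.
#[allow(dead_code)]
fn example_spawn_light(mut commands: Commands) {
    commands.spawn((
        DirectionalLightBundle::default(),
        BlenderLightShadows {
            enabled: false,
            buffer_bias: 0.02,
        },
    ));
}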

View File

@ -0,0 +1,102 @@
pub mod utils;
pub use utils::*;
pub mod ronstring_to_reflect_component;
pub use ronstring_to_reflect_component::*;
pub mod process_gltfs;
pub use process_gltfs::*;
pub mod blender_settings;
use bevy::{
app::Startup,
ecs::{
component::Component,
reflect::ReflectComponent,
system::{Res, Resource},
},
log::warn,
prelude::{App, IntoSystemConfigs, Plugin, SystemSet, Update},
reflect::Reflect,
};
/// A Bevy plugin for extracting components from gltf files and automatically adding them to the relevant entities.
/// It runs automatically every time you load a gltf file.
/// Add this plugin to your Bevy app to get access to this feature.
/// ```
/// # use bevy::prelude::*;
/// # use bevy::gltf::*;
/// # use bevy_gltf_components::ComponentsFromGltfPlugin;
///
/// // too barebones an example to be meaningful; please see https://github.com/kaosat-dev/Blender_bevy_components_workflow/examples/basic for a real example
/// fn main() {
/// App::new()
/// .add_plugins(DefaultPlugins)
/// .add_plugin(ComponentsFromGltfPlugin)
/// .add_system(spawn_level)
/// .run();
/// }
///
/// fn spawn_level(
/// asset_server: Res<AssetServer>,
/// mut commands: bevy::prelude::Commands,
/// keycode: Res<Input<KeyCode>>,
/// ){
/// if keycode.just_pressed(KeyCode::Return) {
/// commands.spawn(SceneBundle {
/// scene: asset_server.load("basic/models/level1.glb"),
/// transform: Transform::from_xyz(2.0, 0.0, -5.0),
/// ..Default::default()
/// });
/// }
///}
/// ```
/// A flag component to tag an already-processed gltf, to avoid processing things multiple times
#[derive(Component, Reflect, Default, Debug)]
#[reflect(Component)]
pub struct GltfProcessed;
#[derive(SystemSet, Debug, Hash, PartialEq, Eq, Clone)]
/// A SystemSet to order your own systems after the component injection when needed
pub enum GltfComponentsSet {
Injection,
}
#[derive(Clone, Resource)]
pub struct GltfComponentsConfig {
pub(crate) legacy_mode: bool,
}
pub struct ComponentsFromGltfPlugin {
pub legacy_mode: bool,
}
impl Default for ComponentsFromGltfPlugin {
fn default() -> Self {
Self { legacy_mode: true }
}
}
fn check_for_legacy_mode(gltf_components_config: Res<GltfComponentsConfig>) {
if gltf_components_config.legacy_mode {
warn!("using simplified component definitions is deprecated since 0.3, prefer defining components with real ron values (use the bevy_components tool for Blender for simplicity) ");
}
}
impl Plugin for ComponentsFromGltfPlugin {
fn build(&self, app: &mut App) {
app.add_plugins(blender_settings::plugin)
.register_type::<GltfProcessed>()
.insert_resource(GltfComponentsConfig {
legacy_mode: self.legacy_mode,
})
.add_systems(Startup, check_for_legacy_mode)
.add_systems(
Update,
(add_components_from_gltf_extras).in_set(GltfComponentsSet::Injection),
);
}
}
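// A minimal scheduling sketch (hypothetical user code): systems reading the injected
// components should run after `GltfComponentsSet::Injection`, for example a system
// reacting to entities freshly tagged with `GltfProcessed`.
#[allow(dead_code)]
fn example_schedule(app: &mut App) {
    fn log_processed(
        query: bevy::prelude::Query<bevy::prelude::Entity, bevy::prelude::Added<GltfProcessed>>,
    ) {
        for entity in query.iter() {
            bevy::log::debug!("gltf components injected for {:?}", entity);
        }
    }
    app.add_systems(Update, log_processed.after(GltfComponentsSet::Injection));
}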

View File

@ -0,0 +1,97 @@
use bevy::{
core::Name,
ecs::{
entity::Entity,
query::{Added, Without},
reflect::{AppTypeRegistry, ReflectComponent},
world::World,
},
gltf::GltfExtras,
hierarchy::Parent,
log::debug,
reflect::{Reflect, TypeRegistration},
utils::HashMap,
};
use crate::{ronstring_to_reflect_component, GltfComponentsConfig, GltfProcessed};
/// Main function: injects components into each entity of loaded gltf files that has `gltf_extras`, using reflection
pub fn add_components_from_gltf_extras(world: &mut World) {
let mut extras =
world.query_filtered::<(Entity, &Name, &GltfExtras, &Parent), (Added<GltfExtras>, Without<GltfProcessed>)>();
let mut entity_components: HashMap<Entity, Vec<(Box<dyn Reflect>, TypeRegistration)>> =
HashMap::new();
let gltf_components_config = world.resource::<GltfComponentsConfig>();
for (entity, name, extra, parent) in extras.iter(world) {
debug!(
"Name: {}, entity {:?}, parent: {:?}, extras {:?}",
name, entity, parent, extra
);
let type_registry: &AppTypeRegistry = world.resource();
let type_registry = type_registry.read();
let reflect_components = ronstring_to_reflect_component(
&extra.value,
&type_registry,
gltf_components_config.legacy_mode,
);
// we assign the components specified in /xxx_components objects to their parent node
let mut target_entity = entity;
// if the node name contains "components" or ends with "_pa" (i.e. add to parent), the components will not be added to the entity itself but to its parent
// this is mostly used for Blender collections
if name.as_str().contains("components") || name.as_str().ends_with("_pa") {
debug!("adding components to parent");
target_entity = parent.get();
}
debug!("adding to {:?}", target_entity);
// if there were already components set to be added to this entity (for example when entity_data was referring to a parent), update the vec of entity_components accordingly
// this allows, for example, a Blender collection to provide basic ECS data & the instances to override/define their own values
if entity_components.contains_key(&target_entity) {
let mut updated_components: Vec<(Box<dyn Reflect>, TypeRegistration)> = Vec::new();
let current_components = &entity_components[&target_entity];
// first inject the current components
for (component, type_registration) in current_components {
updated_components.push((component.clone_value(), type_registration.clone()));
}
// then inject the new components: this also enables overwriting components set in the collection
for (component, type_registration) in reflect_components {
updated_components.push((component.clone_value(), type_registration));
}
entity_components.insert(target_entity, updated_components);
} else {
entity_components.insert(target_entity, reflect_components);
}
}
for (entity, components) in entity_components {
let type_registry: &AppTypeRegistry = world.resource();
let type_registry = type_registry.clone();
let type_registry = type_registry.read();
if !components.is_empty() {
debug!("--entity {:?}, components {}", entity, components.len());
}
for (component, type_registration) in components {
debug!(
"------adding {} {:?}",
component.get_represented_type_info().unwrap().type_path(),
component
);
{
let mut entity_mut = world.entity_mut(entity);
type_registration
.data::<ReflectComponent>()
.expect("Unable to reflect component")
.insert(&mut entity_mut, &*component, &type_registry);
entity_mut.insert(GltfProcessed); // this is how we can insert any additional components
}
}
}
}
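// Naming-convention example (hypothetical node names): a Blender node called
// "Spawner_components" or "Spawner_pa" has its gltf_extras components applied to its
// parent entity rather than to itself, which per the comments above is mostly used
// for Blender collections.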

View File

@ -0,0 +1,134 @@
use bevy::log::{debug, warn};
use bevy::reflect::serde::ReflectDeserializer;
use bevy::reflect::{Reflect, TypeInfo, TypeRegistration, TypeRegistry};
use bevy::utils::HashMap;
use ron::Value;
use serde::de::DeserializeSeed;
use super::capitalize_first_letter;
pub fn ronstring_to_reflect_component(
ron_string: &str,
type_registry: &TypeRegistry,
simplified_types: bool,
) -> Vec<(Box<dyn Reflect>, TypeRegistration)> {
let lookup: HashMap<String, Value> = ron::from_str(ron_string).unwrap();
let mut components: Vec<(Box<dyn Reflect>, TypeRegistration)> = Vec::new();
for (key, value) in lookup.into_iter() {
let type_string = key.replace("component: ", "").trim().to_string();
let capitalized_type_name = capitalize_first_letter(type_string.as_str());
let mut parsed_value: String;
match value.clone() {
Value::String(str) => {
parsed_value = str;
}
_ => parsed_value = ron::to_string(&value).unwrap().to_string(),
}
if let Some(type_registration) =
type_registry.get_with_short_type_path(capitalized_type_name.as_str())
{
debug!("TYPE INFO {:?}", type_registration.type_info());
if simplified_types {
if let TypeInfo::TupleStruct(info) = type_registration.type_info() {
// we handle tuple structs with only one field differently, as Blender's custom properties with custom UI (float, int, bool, etc.) always give us a tuple struct
if info.field_len() == 1 {
let field = info
.field_at(0)
.expect("we should always have at least one field here");
let field_name = field.type_path();
let mut formated = parsed_value.clone();
match field_name {
"f32" => {
formated = parsed_value.parse::<f32>().unwrap().to_string();
}
"f64" => {
formated = parsed_value.parse::<f64>().unwrap().to_string();
}
"u8" => {
formated = parsed_value.parse::<u8>().unwrap().to_string();
}
"u16" => {
formated = parsed_value.parse::<u16>().unwrap().to_string();
}
"u32" => {
formated = parsed_value.parse::<u32>().unwrap().to_string();
}
"u64" => {
formated = parsed_value.parse::<u64>().unwrap().to_string();
}
"u128" => {
formated = parsed_value.parse::<u128>().unwrap().to_string();
}
"glam::Vec2" => {
let parsed: Vec<f32> = ron::from_str(&parsed_value).unwrap();
formated = format!("(x:{},y:{})", parsed[0], parsed[1]);
}
"glam::Vec3" => {
let parsed: Vec<f32> = ron::from_str(&parsed_value).unwrap();
formated =
format!("(x:{},y:{},z:{})", parsed[0], parsed[1], parsed[2]);
}
"bevy_render::color::Color" => {
let parsed: Vec<f32> = ron::from_str(&parsed_value).unwrap();
if parsed.len() == 3 {
formated = format!(
"Rgba(red:{},green:{},blue:{}, alpha: 1.0)",
parsed[0], parsed[1], parsed[2]
);
}
if parsed.len() == 4 {
formated = format!(
"Rgba(red:{},green:{},blue:{}, alpha:{})",
parsed[0], parsed[1], parsed[2], parsed[3]
);
}
}
_ => {}
}
parsed_value = format!("({formated})");
}
}
if parsed_value.is_empty() {
parsed_value = "()".to_string();
}
}
let ron_string = format!(
"{{ \"{}\":{} }}",
type_registration.type_info().type_path(),
parsed_value
);
// useful to determine what an entity looks like when serialized
/*let test_struct = CameraRenderGraph::new("name");
let serializer = ReflectSerializer::new(&test_struct, &type_registry);
let serialized =
ron::ser::to_string_pretty(&serializer, ron::ser::PrettyConfig::default()).unwrap();
println!("serialized Component {}", serialized);*/
debug!("component data ron string {}", ron_string);
let mut deserializer = ron::Deserializer::from_str(ron_string.as_str())
.expect("deserialzer should have been generated from string");
let reflect_deserializer = ReflectDeserializer::new(type_registry);
let component = reflect_deserializer
.deserialize(&mut deserializer)
.unwrap_or_else(|_| {
panic!(
"failed to deserialize component {} with value: {:?}",
key, value
)
});
debug!("component {:?}", component);
debug!("real type {:?}", component.get_represented_type_info());
components.push((component, type_registration.clone()));
debug!("found type registration for {}", capitalized_type_name);
} else {
warn!("no type registration for {}", capitalized_type_name);
}
}
components
}
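// Input-format sketch, derived from the parsing above (component and crate names are
// hypothetical): the RON string maps a component's short type path (optionally prefixed
// with "component: ") to either a RON string or any RON value, e.g.
//   ronstring_to_reflect_component(r#"{"Health": "(max: 100.0)"}"#, &type_registry, false)
// yields one `(Box<dyn Reflect>, TypeRegistration)` pair, deserialized from
// `{ "my_crate::Health": (max: 100.0) }`.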

View File

@ -0,0 +1,3 @@
pub fn capitalize_first_letter(s: &str) -> String {
    let mut chars = s.chars(); // char-based so empty or multi-byte input does not panic
    chars.next().map_or_else(String::new, |first| {
        first.to_uppercase().collect::<String>() + chars.as_str()
    })
}
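// e.g. capitalize_first_letter("health") == "Health"; used so that lower-cased Blender
// custom property names can still be matched against Bevy's short type paths.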

View File

@ -0,0 +1,21 @@
[package]
name = "bevy_gltf_save_load"
version = "0.5.0"
authors = ["Mark 'kaosat-dev' Moissette"]
description = "Save & load your bevy games"
homepage = "https://github.com/kaosat-dev/Blender_bevy_components_workflow"
repository = "https://github.com/kaosat-dev/Blender_bevy_components_workflow"
keywords = ["gamedev", "bevy", "save", "load", "serialize"]
categories = ["game-development"]
edition = "2021"
license = "MIT OR Apache-2.0"
[lints]
workspace = true
[dependencies]
bevy = { version = "0.14", default-features = false, features = ["bevy_asset", "bevy_scene", "bevy_gltf"] }
bevy_gltf_blueprints = { version = "0.11", path = "../bevy_gltf_blueprints" }
[dev-dependencies]
bevy = { version = "0.14", default-features = false, features = ["dynamic_linking"] }

View File

@ -0,0 +1,4 @@
This crate is available under either:
* The [MIT License](./LICENSE_MIT)
* The [Apache License, Version 2.0](./LICENSE_APACHE)

View File

@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [2023] [Mark "kaosat-dev" Moissette]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

View File

@ -0,0 +1,21 @@
MIT License
Copyright (c) 2023 Mark "kaosat-dev" Moissette
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View File

@ -0,0 +1,216 @@
use gltf_json as json;
use json::camera::Type;
use json::validation::{Checked, Validate};
use serde_json::value::{to_raw_value, RawValue};
use serde::Serialize;
use bevy::reflect::TypeRegistryArc;
#[derive(Clone, Copy, Debug, Eq, Hash, PartialEq)]
enum Output {
/// Output standard glTF.
Standard,
/// Output binary glTF.
Binary,
}
#[derive(Serialize)]
#[allow(non_snake_case)] // field names deliberately mirror the component names written into gltf extras
struct MyExtraData {
a: u32,
b: u32,
BlueprintName: String,
SpawnHere: String,
}
/*
pub fn serialize_gltf_inner<S>(serialize: S) -> Result<String, json::Error>
where
S: Serialize,
{
let pretty_config = ron::ser::PrettyConfig::default()
.indentor(" ".to_string())
.new_line("\n".to_string());
ron::ser::to_string_pretty(&serialize, pretty_config)
}*/
pub fn serialize_gltf(scene:&DynamicScene, registry: &TypeRegistryArc) {
}
pub fn save_game(
world: &mut World,
) {
let mut save_path:String = "".into();
let mut events = world
.resource_mut::<Events<SaveRequest>>();
for event in events.get_reader().read(&events) {
info!("SAVE EVENT !! {:?}", event);
save_path = event.path.clone();
}
info!("SAVING TO {}", save_path);
events.clear();
let saveable_entities: Vec<Entity> = world
.query_filtered::<Entity, With<Dynamic>>()
.iter(world)
.collect();
debug!("saveable entities {}", saveable_entities.len());
let components = HashSet::from([
TypeId::of::<Name>(),
TypeId::of::<Transform>(),
TypeId::of::<Velocity>() ,
TypeId::of::<BlueprintName>(),
TypeId::of::<SpawnHere>(),
TypeId::of::<Dynamic>(),
TypeId::of::<Camera>(),
TypeId::of::<Camera3d>(),
TypeId::of::<Tonemapping>(),
TypeId::of::<CameraTrackingOffset>(),
TypeId::of::<Projection>(),
TypeId::of::<CameraRenderGraph>(),
TypeId::of::<Frustum>(),
TypeId::of::<GlobalTransform>(),
TypeId::of::<VisibleEntities>(),
TypeId::of::<Pickable>(),
]);
let filter = SceneFilter::Allowlist(components);
let mut scene_builder = DynamicSceneBuilder::from_world(world).with_filter(filter);
let dyn_scene = scene_builder
/* .allow::<Transform>()
.allow::<Velocity>()
.allow::<BlueprintName>()*/
/* .deny::<Children>()
.deny::<Parent>()
.deny::<InheritedVisibility>()
.deny::<Visibility>()
.deny::<GltfExtras>()
.deny::<GlobalTransform>()
.deny::<Collider>()
.deny::<RigidBody>()
.deny::<Saveable>()
// camera stuff
.deny::<Camera>()
.deny::<CameraRenderGraph>()
.deny::<Camera3d>()
.deny::<Clusters>()
.deny::<VisibleEntities>()
.deny::<VisiblePointLights>()
//.deny::<HasGizmoMarker>()
*/
.extract_entities(saveable_entities.into_iter())
.build();
let serialized_scene = dyn_scene
.serialize_ron(world.resource::<AppTypeRegistry>())
.unwrap();
let mut root = gltf_json::Root::default();
// unfortunately, not available yet
/*let node = root.push(json::Node {
//mesh: Some(mesh),
..Default::default()
});
root.push(json::Scene {
extensions: Default::default(),
extras: Default::default(),
name: None,
nodes: vec![node],
});*/
let camera = json::camera::Perspective{
aspect_ratio: Some(0.5),
yfov: 32.0,
zfar: Some(30.),
znear: 0.0,
extensions: None,
extras: None
};
/*let camera = json::Camera{
name:Some("Camera".into()),
orthographic: None,
perspective:None,
extensions: None,
extras: None,
type_: Checked<Type::Perspective>,
};*/
let gna = to_raw_value(&MyExtraData { a: 1, b: 2, BlueprintName: "Foo".into(), SpawnHere: "".into() }).unwrap();
let node = json::Node {
camera: None,//Some(camera),
children: None,
extensions: None,
extras: Some(gna),
matrix: None,
mesh:None,
name: Some("yeah".into()),
rotation: None,
scale: None,
translation: Some([0.5, 10.0 ,-100.]),
skin: None,
weights: None
// mesh: Some(json::Index::new(0)),
//..Default::default()
};
let root = json::Root {
accessors: vec![], //[positions, colors],
buffers: vec![],
buffer_views: vec![],
meshes: vec![],
nodes: vec![node],
scenes: vec![json::Scene {
extensions: Default::default(),
extras: Default::default(),
name: Some("Foo".to_string()),
nodes: vec![json::Index::new(0)],
}],
..Default::default()
};
let gltf_save_name = "test.gltf";
let writer = fs::File::create(format!("assets/scenes/{gltf_save_name}") ).expect("I/O error");
json::serialize::to_writer_pretty(writer, &root).expect("Serialization error");
// let bin = to_padded_byte_vector(triangle_vertices);
// let mut writer = fs::File::create("triangle/buffer0.bin").expect("I/O error");
// writer.write_all(&bin).expect("I/O error");
#[cfg(not(target_arch = "wasm32"))]
IoTaskPool::get()
.spawn(async move {
// Write the scene RON data to file
File::create(format!("assets/scenes/{save_path}"))
.and_then(|mut file| file.write(serialized_scene.as_bytes()))
.expect("Error while writing scene to file");
})
.detach();
}

View File

@ -32,7 +32,7 @@ pub(crate) fn mark_load_requested(
    let mut save_path: String = "".into();
    for load_request in load_requests.read() {
        if !load_request.path.is_empty() {
-            save_path.clone_from(&load_request.path);
+            save_path = load_request.path.clone();
        }
    }
    if !save_path.is_empty() {

View File

@ -80,7 +80,7 @@ pub(crate) fn save_game(world: &mut World) {
    for event in events.get_reader().read(&events) {
        info!("SAVE EVENT !! {:?}", event);
-        save_path.clone_from(&event.path);
+        save_path = event.path.clone();
    }
    events.clear();

View File

@ -0,0 +1,21 @@
[package]
name = "bevy_registry_export"
version = "0.4.0"
authors = ["Mark 'kaosat-dev' Moissette", "Pascal 'Killercup' Hertleif"]
description = "Allows you to create a Json export of all your components/ registered types of your Bevy app/game"
homepage = "https://github.com/kaosat-dev/Blender_bevy_components_workflow"
repository = "https://github.com/kaosat-dev/Blender_bevy_components_workflow"
keywords = ["gamedev", "bevy", "assets", "registry", "components"]
categories = ["game-development"]
edition = "2021"
license = "MIT OR Apache-2.0"
[dependencies]
bevy = { version = "0.14", default-features = false, features = ["bevy_scene"] }
bevy_reflect = { version = "0.14", default-features = false }
bevy_app = { version = "0.14", default-features = false, features = ["bevy_reflect"] }
bevy_ecs = { version = "0.14", default-features = false, features = ["bevy_reflect"] }
serde_json = "1.0.108"
[dev-dependencies]
bevy = { version = "0.14", default-features = false, features = ["dynamic_linking"] }

View File

@ -0,0 +1,4 @@
This crate is available under either:
* The [MIT License](./LICENSE_MIT)
* The [Apache License, Version 2.0](./LICENSE_APACHE)

View File

@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [2024] [Mark "kaosat-dev" Moissette]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

View File

@ -0,0 +1,21 @@
MIT License
Copyright (c) 2024 Mark "kaosat-dev" Moissette
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View File

@ -0,0 +1,269 @@
use std::{fs::File, path::Path};
use bevy::log::info;
use bevy_ecs::{
reflect::{AppTypeRegistry, ReflectComponent, ReflectResource},
world::World,
};
use bevy_reflect::{TypeInfo, TypeRegistration, VariantInfo}; // TypePath // DynamicTypePath
use serde_json::{json, Map, Value};
use crate::{AssetRoot, ExportComponentsConfig};
pub fn export_types(world: &mut World) {
let config = world
.get_resource::<ExportComponentsConfig>()
.expect("ExportComponentsConfig should exist at this stage");
let asset_root = world.resource::<AssetRoot>();
let registry_save_path = Path::join(&asset_root.0, &config.save_path);
println!("registry_save_path {}", registry_save_path.display());
let writer = File::create(registry_save_path).expect("should have created schema file");
let types = world.resource_mut::<AppTypeRegistry>();
let types = types.read();
let schemas = types.iter().map(export_type).collect::<Map<_, _>>();
serde_json::to_writer_pretty(
writer,
&json!({
"$schema": "https://json-schema.org/draft/2020-12/schema",
"title": "bevy component registry schema",
"$defs": schemas,
}),
)
.expect("valid json");
info!("Done exporting registry schema")
}
pub fn export_type(reg: &TypeRegistration) -> (String, Value) {
let t = reg.type_info();
let binding = t.type_path_table();
let short_name = binding.short_path();
let mut schema = match t {
TypeInfo::Struct(info) => {
let properties = info
.iter()
.enumerate()
.map(|(idx, field)| {
(
field.name().to_owned(),
add_min_max(json!({ "type": typ(field.type_path()) }), reg, idx, None),
)
})
.collect::<Map<_, _>>();
json!({
"type": "object",
"typeInfo": "Struct",
"title": t.type_path(),
"properties": properties,
"additionalProperties": false,
"required": info
.iter()
.filter(|field| !field.type_path().starts_with("core::option::Option"))
.map(|field| field.name())
.collect::<Vec<_>>(),
})
}
TypeInfo::Enum(info) => {
let simple = info
.iter()
.all(|variant| matches!(variant, VariantInfo::Unit(_)));
if simple {
json!({
"type": "string",
"typeInfo": "Enum",
"title": t.type_path(),
"oneOf": info
.iter()
.map(|variant| match variant {
VariantInfo::Unit(v) => v.name(),
_ => unreachable!(),
})
.collect::<Vec<_>>(),
})
} else {
let variants = info
.iter()
.enumerate()
.map(|(field_idx, variant)| match variant {
//let binding = t.type_path_table();
//let short_name = binding.short_path();
VariantInfo::Struct(v) => json!({
"type": "object",
"typeInfo": "Struct",
"title": v.name(),
"short_name": v.name().split("::").last().unwrap_or(v.name()),
"properties": v
.iter()
.enumerate()
.map(|(variant_idx, field)| (field.name().to_owned(), add_min_max(json!({"type": typ(field.type_path()), "title": field.name()}), reg, field_idx, Some(variant_idx))))
.collect::<Map<_, _>>(),
"additionalProperties": false,
"required": v
.iter()
.filter(|field| !field.type_path().starts_with("core::option::Option"))
.map(|field| field.name())
.collect::<Vec<_>>(),
}),
VariantInfo::Tuple(v) => json!({
"type": "array",
"typeInfo": "Tuple",
"title": v.name(),
"short_name":v.name(),
"prefixItems": v
.iter()
.enumerate()
.map(|(variant_idx, field)| add_min_max(json!({"type": typ(field.type_path())}), reg, field_idx, Some(variant_idx)))
.collect::<Vec<_>>(),
"items": false,
}),
VariantInfo::Unit(v) => json!({
"title": v.name(),
}),
})
.collect::<Vec<_>>();
json!({
"type": "object",
"typeInfo": "Enum",
"title": t.type_path(),
"oneOf": variants,
})
}
}
TypeInfo::TupleStruct(info) => json!({
"title": t.type_path(),
"type": "array",
"typeInfo": "TupleStruct",
"prefixItems": info
.iter()
.enumerate()
.map(|(idx, field)| add_min_max(json!({"type": typ(field.type_path())}), reg, idx, None))
.collect::<Vec<_>>(),
"items": false,
}),
TypeInfo::List(info) => {
json!({
"title": t.type_path(),
"type": "array",
"typeInfo": "List",
"items": json!({"type": typ(info.item_type_path_table().path())}),
})
}
TypeInfo::Array(info) => json!({
"title": t.type_path(),
"type": "array",
"typeInfo": "Array",
"items": json!({"type": typ(info.item_type_path_table().path())}),
}),
TypeInfo::Map(info) => json!({
"title": t.type_path(),
"type": "object",
"typeInfo": "Map",
"additionalProperties": json!({"type": typ(info.value_type_path_table().path())}),
}),
TypeInfo::Tuple(info) => json!({
"title": t.type_path(),
"type": "array",
"typeInfo": "Tuple",
"prefixItems": info
.iter()
.enumerate()
.map(|(idx, field)| add_min_max(json!({"type": typ(field.type_path())}), reg, idx, None))
.collect::<Vec<_>>(),
"items": false,
}),
TypeInfo::Value(info) => json!({
"title": t.type_path(),
"type": map_json_type(info.type_path()),
"typeInfo": "Value",
}),
};
schema.as_object_mut().unwrap().insert(
"isComponent".to_owned(),
reg.data::<ReflectComponent>().is_some().into(),
);
schema.as_object_mut().unwrap().insert(
"isResource".to_owned(),
reg.data::<ReflectResource>().is_some().into(),
);
schema
.as_object_mut()
.unwrap()
.insert("short_name".to_owned(), short_name.into());
(t.type_path().to_owned(), schema)
}
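// Output sketch for a hypothetical reflected component `my_crate::Health { max: f32 }`:
// export_type would emit roughly
// {
//   "type": "object", "typeInfo": "Struct", "title": "my_crate::Health",
//   "properties": { "max": { "type": { "$ref": "#/$defs/f32" } } },
//   "additionalProperties": false, "required": ["max"],
//   "isComponent": true, "isResource": false, "short_name": "Health"
// }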
fn typ(t: &str) -> Value {
json!({ "$ref": format!("#/$defs/{t}") })
}
fn map_json_type(t: &str) -> Value {
match t {
"bool" => "boolean",
"u8" | "u16" | "u32" | "u64" | "u128" | "usize" => "uint",
"i8" | "i16" | "i32" | "i64" | "i128" | "isize" => "int",
"f32" | "f64" => "float",
"char" | "str" | "alloc::string::String" => "string",
_ => "object",
}
.into()
}
fn add_min_max(
mut val: Value,
reg: &TypeRegistration,
field_index: usize,
variant_index: Option<usize>,
) -> Value {
#[cfg(feature = "support-inspector")]
fn get_min_max(
reg: &TypeRegistration,
field_index: usize,
variant_index: Option<usize>,
) -> Option<(Option<f32>, Option<f32>)> {
use bevy_inspector_egui::inspector_options::{
std_options::NumberOptions, ReflectInspectorOptions, Target,
};
reg.data::<ReflectInspectorOptions>()
.and_then(|ReflectInspectorOptions(o)| {
o.get(if let Some(variant_index) = variant_index {
Target::VariantField {
variant_index,
field_index,
}
} else {
Target::Field(field_index)
})
})
.and_then(|o| o.downcast_ref::<NumberOptions<f32>>())
.map(|num| (num.min, num.max))
}
#[cfg(not(feature = "support-inspector"))]
fn get_min_max(
_reg: &TypeRegistration,
_field_index: usize,
_variant_index: Option<usize>,
) -> Option<(Option<f32>, Option<f32>)> {
None
}
let Some((min, max)) = get_min_max(reg, field_index, variant_index) else {
return val;
};
let obj = val.as_object_mut().unwrap();
if let Some(min) = min {
obj.insert("minimum".to_owned(), min.into());
}
if let Some(max) = max {
obj.insert("maximum".to_owned(), max.into());
}
val
}
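For orientation, here is a small sketch (not part of this diff) of what the two helpers above return for a primitive field type, derived purely from reading `typ` and `map_json_type`:

```rust
// Sketch only, derived from the helper bodies above:
// `typ` wraps a type path into a JSON-schema style `$ref`, while
// `map_json_type` maps Rust primitives to the simplified type names
// consumed by the Blender add-on.
assert_eq!(typ("f32"), json!({ "$ref": "#/$defs/f32" }));
assert_eq!(map_json_type("f32"), json!("float"));
assert_eq!(map_json_type("alloc::string::String"), json!("string"));
```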

View File

@ -0,0 +1,73 @@
pub mod export_types;
use std::path::PathBuf;
use bevy_app::Startup;
use bevy_ecs::system::Resource;
pub use export_types::*;
use bevy::{
asset::AssetPlugin,
prelude::{App, Plugin},
scene::SceneFilter,
};
// Plugin configuration
#[derive(Clone, Resource)]
pub struct ExportComponentsConfig {
pub(crate) save_path: PathBuf,
#[allow(dead_code)]
pub(crate) component_filter: SceneFilter, // unused for now
#[allow(dead_code)]
pub(crate) resource_filter: SceneFilter, // unused for now
}
pub struct ExportRegistryPlugin {
pub component_filter: SceneFilter,
pub resource_filter: SceneFilter,
pub save_path: PathBuf,
}
impl Default for ExportRegistryPlugin {
fn default() -> Self {
Self {
component_filter: SceneFilter::default(), // unused for now
resource_filter: SceneFilter::default(), // unused for now
save_path: PathBuf::from("registry.json"), // relative to assets folder
}
}
}
impl Plugin for ExportRegistryPlugin {
fn build(&self, app: &mut App) {
app.register_asset_root()
.insert_resource(ExportComponentsConfig {
save_path: self.save_path.clone(),
component_filter: self.component_filter.clone(),
resource_filter: self.resource_filter.clone(),
})
.add_systems(Startup, export_types);
}
}
trait RegistryExportApp {
fn register_asset_root(&mut self) -> &mut Self;
}
impl RegistryExportApp for App {
fn register_asset_root(&mut self) -> &mut Self {
let asset_plugin = get_asset_plugin(self);
let path_str = asset_plugin.file_path.clone();
let path = PathBuf::from(path_str);
self.insert_resource(AssetRoot(path))
}
}
fn get_asset_plugin(app: &App) -> &AssetPlugin {
let asset_plugins: Vec<&AssetPlugin> = app.get_added_plugins();
asset_plugins.into_iter().next().expect(ASSET_ERROR)
}
const ASSET_ERROR: &str = "Bevy_registry_export requires access to the Bevy asset plugin. \
Please add `ExportRegistryPlugin` after `AssetPlugin`, which is commonly added as part of the `DefaultPlugins`";
#[derive(Debug, Clone, PartialEq, Eq, Hash, Resource)]
pub(crate) struct AssetRoot(pub(crate) PathBuf);
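A minimal usage sketch (assumed, not part of this diff): the plugin has to be registered after `AssetPlugin`, which is exactly what the `ASSET_ERROR` message above asks for, so adding it after `DefaultPlugins` is enough:

```rust
use bevy::prelude::*;
// assumed import path; adjust to wherever ExportRegistryPlugin lives in your project
use bevy_registry_export::ExportRegistryPlugin;

fn main() {
    App::new()
        // DefaultPlugins include AssetPlugin, which must be added first
        .add_plugins(DefaultPlugins)
        // with the defaults this writes the reflection schema to assets/registry.json at Startup
        .add_plugins(ExportRegistryPlugin::default())
        .run();
}
```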

View File

@ -1,6 +1,6 @@
[package] [package]
name = "blenvy" name = "blenvy"
version = "0.1.0-alpha.1" version = "0.1.0"
authors = ["Mark 'kaosat-dev' Moissette"] authors = ["Mark 'kaosat-dev' Moissette"]
description = "Allows you to define Bevy components direclty inside gltf files and instanciate the components on the Bevy side." description = "Allows you to define Bevy components direclty inside gltf files and instanciate the components on the Bevy side."
homepage = "https://github.com/kaosat-dev/Blenvy" homepage = "https://github.com/kaosat-dev/Blenvy"

View File

@ -1,4 +1,4 @@
# Examples # Examples
This folder contains numerous examples showing how to use Blenvy This folder contains numerous examples showing how to use both bevy_gltf_components and bevy_gltf_blueprints.
Each example is its own crate so its dependencies and assets are specific & clear. Each example is its own crate so its dependencies are specific & clear.

View File

@ -1,8 +1,54 @@
use bevy::prelude::*; use bevy::prelude::*;
use bevy::render::mesh::PrimitiveTopology; use bevy::render::mesh::{MeshVertexAttributeId, PrimitiveTopology, VertexAttributeValues};
// TAKEN VERBATIM FROM https://github.com/janhohenheim/foxtrot/blob/src/util/trait_extension.rs // TAKEN VERBATIM FROM https://github.com/janhohenheim/foxtrot/blob/src/util/trait_extension.rs
pub(crate) trait Vec3Ext: Copy {
fn is_approx_zero(self) -> bool;
fn split(self, up: Vec3) -> SplitVec3;
}
impl Vec3Ext for Vec3 {
#[inline]
fn is_approx_zero(self) -> bool {
self.length_squared() < 1e-5
}
#[inline]
fn split(self, up: Vec3) -> SplitVec3 {
let vertical = up * self.dot(up);
let horizontal = self - vertical;
SplitVec3 {
vertical,
horizontal,
}
}
}
#[derive(Debug, Clone, Copy, PartialEq)]
pub(crate) struct SplitVec3 {
pub(crate) vertical: Vec3,
pub(crate) horizontal: Vec3,
}
pub(crate) trait Vec2Ext: Copy {
fn is_approx_zero(self) -> bool;
fn x0y(self) -> Vec3;
}
impl Vec2Ext for Vec2 {
#[inline]
fn is_approx_zero(self) -> bool {
self.length_squared() < 1e-5
}
#[inline]
fn x0y(self) -> Vec3 {
Vec3::new(self.x, 0., self.y)
}
}
pub(crate) trait MeshExt { pub(crate) trait MeshExt {
fn transform(&mut self, transform: Transform);
fn transformed(&self, transform: Transform) -> Mesh;
fn read_coords_mut(&mut self, id: impl Into<MeshVertexAttributeId>) -> &mut Vec<[f32; 3]>;
fn search_in_children<'a>( fn search_in_children<'a>(
parent: Entity, parent: Entity,
children: &'a Query<&Children>, children: &'a Query<&Children>,
@ -12,6 +58,37 @@ pub(crate) trait MeshExt {
} }
impl MeshExt for Mesh { impl MeshExt for Mesh {
fn transform(&mut self, transform: Transform) {
for coords in self.read_coords_mut(Mesh::ATTRIBUTE_POSITION.clone()) {
let vec3 = (*coords).into();
let transformed = transform.transform_point(vec3);
*coords = transformed.into();
}
for normal in self.read_coords_mut(Mesh::ATTRIBUTE_NORMAL.clone()) {
let vec3 = (*normal).into();
let transformed = transform.rotation.mul_vec3(vec3);
*normal = transformed.into();
}
}
fn transformed(&self, transform: Transform) -> Mesh {
let mut mesh = self.clone();
mesh.transform(transform);
mesh
}
fn read_coords_mut(&mut self, id: impl Into<MeshVertexAttributeId>) -> &mut Vec<[f32; 3]> {
// Guaranteed by Bevy for the current usage
match self
.attribute_mut(id)
.expect("Failed to read unknown mesh attribute")
{
VertexAttributeValues::Float32x3(values) => values,
// Guaranteed by Bevy for the current usage
_ => unreachable!(),
}
}
fn search_in_children<'a>( fn search_in_children<'a>(
parent: Entity, parent: Entity,
children_query: &'a Query<&Children>, children_query: &'a Query<&Children>,
@ -48,3 +125,51 @@ impl MeshExt for Mesh {
} }
} }
} }
pub(crate) trait F32Ext: Copy {
fn is_approx_zero(self) -> bool;
fn squared(self) -> f32;
fn lerp(self, other: f32, ratio: f32) -> f32;
}
impl F32Ext for f32 {
#[inline]
fn is_approx_zero(self) -> bool {
self.abs() < 1e-5
}
#[inline]
fn squared(self) -> f32 {
self * self
}
#[inline]
fn lerp(self, other: f32, ratio: f32) -> f32 {
self.mul_add(1. - ratio, other * ratio)
}
}
pub(crate) trait TransformExt: Copy {
fn horizontally_looking_at(self, target: Vec3, up: Vec3) -> Transform;
fn lerp(self, other: Transform, ratio: f32) -> Transform;
}
impl TransformExt for Transform {
fn horizontally_looking_at(self, target: Vec3, up: Vec3) -> Transform {
let direction = target - self.translation;
let horizontal_direction = direction - up * direction.dot(up);
let look_target = self.translation + horizontal_direction;
self.looking_at(look_target, up)
}
fn lerp(self, other: Transform, ratio: f32) -> Transform {
let translation = self.translation.lerp(other.translation, ratio);
let rotation = self.rotation.slerp(other.rotation, ratio);
let scale = self.scale.lerp(other.scale, ratio);
Transform {
translation,
rotation,
scale,
}
}
}
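A short usage sketch for the extension traits above (illustrative only, with made-up values; assumes the traits are in scope within the same crate):

```rust
use bevy::prelude::*;

// Bake a translation into a mesh's vertex data: positions are transformed,
// normals are only rotated (see `MeshExt::transform` above).
fn recenter_mesh(mesh: &Mesh) -> Mesh {
    mesh.transformed(Transform::from_translation(Vec3::new(0.0, 1.0, 0.0)))
}

// Turn towards a target around the vertical axis only, ignoring the height difference.
fn face_target(transform: Transform, target: Vec3) -> Transform {
    transform.horizontally_looking_at(target, Vec3::Y)
}
```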

View File

@ -1,8 +1,54 @@
use bevy::prelude::*; use bevy::prelude::*;
use bevy::render::mesh::PrimitiveTopology; use bevy::render::mesh::{MeshVertexAttributeId, PrimitiveTopology, VertexAttributeValues};
// TAKEN VERBATIM FROM https://github.com/janhohenheim/foxtrot/blob/src/util/trait_extension.rs // TAKEN VERBATIM FROM https://github.com/janhohenheim/foxtrot/blob/src/util/trait_extension.rs
pub(crate) trait Vec3Ext: Copy {
fn is_approx_zero(self) -> bool;
fn split(self, up: Vec3) -> SplitVec3;
}
impl Vec3Ext for Vec3 {
#[inline]
fn is_approx_zero(self) -> bool {
self.length_squared() < 1e-5
}
#[inline]
fn split(self, up: Vec3) -> SplitVec3 {
let vertical = up * self.dot(up);
let horizontal = self - vertical;
SplitVec3 {
vertical,
horizontal,
}
}
}
#[derive(Debug, Clone, Copy, PartialEq)]
pub(crate) struct SplitVec3 {
pub(crate) vertical: Vec3,
pub(crate) horizontal: Vec3,
}
pub(crate) trait Vec2Ext: Copy {
fn is_approx_zero(self) -> bool;
fn x0y(self) -> Vec3;
}
impl Vec2Ext for Vec2 {
#[inline]
fn is_approx_zero(self) -> bool {
self.length_squared() < 1e-5
}
#[inline]
fn x0y(self) -> Vec3 {
Vec3::new(self.x, 0., self.y)
}
}
pub(crate) trait MeshExt { pub(crate) trait MeshExt {
fn transform(&mut self, transform: Transform);
fn transformed(&self, transform: Transform) -> Mesh;
fn read_coords_mut(&mut self, id: impl Into<MeshVertexAttributeId>) -> &mut Vec<[f32; 3]>;
fn search_in_children<'a>( fn search_in_children<'a>(
parent: Entity, parent: Entity,
children: &'a Query<&Children>, children: &'a Query<&Children>,
@ -12,6 +58,37 @@ pub(crate) trait MeshExt {
} }
impl MeshExt for Mesh { impl MeshExt for Mesh {
fn transform(&mut self, transform: Transform) {
for coords in self.read_coords_mut(Mesh::ATTRIBUTE_POSITION.clone()) {
let vec3 = (*coords).into();
let transformed = transform.transform_point(vec3);
*coords = transformed.into();
}
for normal in self.read_coords_mut(Mesh::ATTRIBUTE_NORMAL.clone()) {
let vec3 = (*normal).into();
let transformed = transform.rotation.mul_vec3(vec3);
*normal = transformed.into();
}
}
fn transformed(&self, transform: Transform) -> Mesh {
let mut mesh = self.clone();
mesh.transform(transform);
mesh
}
fn read_coords_mut(&mut self, id: impl Into<MeshVertexAttributeId>) -> &mut Vec<[f32; 3]> {
// Guaranteed by Bevy for the current usage
match self
.attribute_mut(id)
.expect("Failed to read unknown mesh attribute")
{
VertexAttributeValues::Float32x3(values) => values,
// Guaranteed by Bevy for the current usage
_ => unreachable!(),
}
}
fn search_in_children<'a>( fn search_in_children<'a>(
parent: Entity, parent: Entity,
children_query: &'a Query<&Children>, children_query: &'a Query<&Children>,
@ -48,3 +125,51 @@ impl MeshExt for Mesh {
} }
} }
} }
pub(crate) trait F32Ext: Copy {
fn is_approx_zero(self) -> bool;
fn squared(self) -> f32;
fn lerp(self, other: f32, ratio: f32) -> f32;
}
impl F32Ext for f32 {
#[inline]
fn is_approx_zero(self) -> bool {
self.abs() < 1e-5
}
#[inline]
fn squared(self) -> f32 {
self * self
}
#[inline]
fn lerp(self, other: f32, ratio: f32) -> f32 {
self.mul_add(1. - ratio, other * ratio)
}
}
pub(crate) trait TransformExt: Copy {
fn horizontally_looking_at(self, target: Vec3, up: Vec3) -> Transform;
fn lerp(self, other: Transform, ratio: f32) -> Transform;
}
impl TransformExt for Transform {
fn horizontally_looking_at(self, target: Vec3, up: Vec3) -> Transform {
let direction = target - self.translation;
let horizontal_direction = direction - up * direction.dot(up);
let look_target = self.translation + horizontal_direction;
self.looking_at(look_target, up)
}
fn lerp(self, other: Transform, ratio: f32) -> Transform {
let translation = self.translation.lerp(other.translation, ratio);
let rotation = self.rotation.slerp(other.rotation, ratio);
let scale = self.scale.lerp(other.scale, ratio);
Transform {
translation,
rotation,
scale,
}
}
}

View File

@ -6,13 +6,13 @@ This [Blender addon](https://github.com/kaosat-dev/Blender_bevy_components_workf
- the UI is **automatically generated** based on a **registry schema** file, an export of all your **registered** Bevy components' information, generated - the UI is **automatically generated** based on a **registry schema** file, an export of all your **registered** Bevy components' information, generated
by the registry export part of the [Blenvy](https://crates.io/crates/blenvy) crate by the registry export part of the [Blenvy](https://crates.io/crates/blenvy) crate
- the ability to **toggle components** on/off without having to remove the component from the object - the ability to **toggle components** on/off without having to remove the component from the object
- an easy way to create blueprints/prefabs (just collections!) & levels
- a way to set up your assets for your levels & blueprints
- an automatic export of your level/world from Blender to gltf whenever you save your Blend file. - an automatic export of your level/world from Blender to gltf whenever you save your Blend file.
- export of used/marked collections as [Gltf blueprints](../../crates/blenvy/README.md) - export of used/marked collections as [Gltf blueprints](../../crates/blenvy/README.md)
- change detection, so that only the levels & blueprints you have changed get exported when you save your blend file - change detection, so that only the levels & blueprints you have changed get exported when you save your blend file
- export of material libraries - export of material libraries
- a way to set up your assets for your levels & blueprints in Blender
If you want to know more about the technical details , see [here]() If you want to know more about the technical details , see [here]()
@ -36,90 +36,34 @@ If you can I would generally recommend starting fresh, but a lot of effort has b
![blender addon install](./docs/blender_addon_install2.png) ![blender addon install](./docs/blender_addon_install2.png)
* up to Blender 4.1
* for Blender 4.2 , just drag & drop the zip file onto Blender to start the installation process
## Quickstart ## Quickstart
* set your level & library scenes (the only things that are not pre-configured)
![blenvy common settings](./docs/blenvy_configuration_common.png)
* create your blueprints & levels
* add components (remember to configure the Bevy side first)
* save your blend file at any point, the rest is done automatically (export of levels & blueprints, etc)
## Configuration: ## Configuration:
### Bevy side ### Bevy side
- set up the [Blenvy crate](https://crates.io/crates/blenvy) for your project (see the crate's documentation for that), and compile/run it to get the ```registry.json``` file to enable adding/editing your components in Blender - set up the [Blenvy crate](https://crates.io/crates/blenvy) for your project (see the crate's documentation for that), and compile/run it to get the ```registry.json``` file to enable adding/editing your components in Blender
### Blender side ### Blender side
> The add-on comes mostly pre-configured with sensible defaults, but you can set the following settings to your liking
> The add-on comes almost completely pre-configured with sensible defaults, but you can set the following settings to your liking
#### Common #### Common
The first tab (and the one that is open by default in a new project) contains the common settings:
![blenvy common settings](./docs/blenvy_configuration_common.png)
you **need** to tell Blenvy you **need** to tell Blenvy
- what your level scenes are (what Blender scenes should become levels in Bevy) - what your level scenes are (what Blender scenes should become levels in Bevy)
- what your library scenes are (what Blender scenes will store your library of re-useable blueprints) - what your library scenes are (what Blender scenes will store your library of re-useable blueprints)
Blenvy is opinionated! Blenvy is opinionated!
- keep your art/sources (usually not delivered with your game) separate from your game assets - keep your art/sources (usually not delivered with your game) separate from your game assets
- keep your blueprints/levels/materials gltf files separate - keep your blueprints/levels/materials gltf files separate
##### Root Folder
- this is the same folder as your Bevy project's main folder: the path here is relative to the current .blend file
##### Assets Folder
- a path, relative to the *root* folder above, where you want to store your assets (delivered with your game)
##### Library Folder
- a path, relative to the *assets* folder above, where you want to store your *blueprints*
##### Levels Folder
- a path, relative to the *assets* folder above, where you want to store your *levels*
##### Materials Folder
- a path, relative to the *assets* folder above, where you want to store your *materials*
##### Scenes
- level scenes: the scenes in your .blend file that are levels/worlds
- library scenes: the scenes in your .blend file that contain your libraries of blueprints (that you then use in your levels)
#### Recommended folder structure
![recommended folder structure](./docs/blenvy_recommended_folder_structure.png)
![recommended folder structure art](./docs/blenvy_recommended_folder_structure_art.png)
![recommended folder structure assets](./docs/blenvy_recommended_folder_structure_assets.png)
#### Components #### Components
The second tab contains the component settings: > the defaults are already pre-set to match those on the Bevy side for the location of the ```registry.json``` file, unless you want to store it somewhere other than ```assets/registry.json```
![blenvy component settings](./docs/blenvy_configuration_components.png)
> you normally do not need to do anything, as the defaults are already pre-set to match those on the Bevy side for the location of the ```registry.json``` file, unless you want to store it somewhere other than ```assets/registry.json```
- Go to the new Components tab in the **configuration** tab - Go to the new Components tab in the **configuration** tab
@ -141,11 +85,7 @@ The second tab contains the component settings:
![registry file polling](./docs/registry_polling.png) ![registry file polling](./docs/registry_polling.png)
#### Export #### Auto-export
Last but not least, the export/ auto-export settings tab
![blenvy export settings](./docs/blenvy_configuration_export.png)
### Materials ### Materials
@ -172,7 +112,7 @@ If you want to use multiple blend files, use Blender's asset library etc, we got
There are only a few things to keep in mind There are only a few things to keep in mind
#### Assets/library/blueprints files #### Assets/library/blueprints files
- mark your library scenes as specified above, but **do NOT** specify a **level** scene - mark your library scenes as specified above, but **do NOT** specify a **main** scene
- mark any collection in your scenes as "assets" - mark any collection in your scenes as "assets"
- choose "split" for the combine mode (as you want your gltf blueprints to be saved for external use) - choose "split" for the combine mode (as you want your gltf blueprints to be saved for external use)
- do your Blender things as normal - do your Blender things as normal
@ -180,7 +120,7 @@ There are only a few things to keep in mind
- (optional) activate the **material library** option, so you only have one set of materials per asset library (recommended) - (optional) activate the **material library** option, so you only have one set of materials per asset library (recommended)
#### Level/world files #### Level/world files
- mark your level scenes as specified above (personally I recommend **NOT** specifying a **library** scene in this case to keep things tidy, but that is up to you) - mark your main scenes as specified above (personally I recommend **NOT** specifying a **library** scene in this case to keep things tidy, but that is up to you)
- configure your asset libraries as you would usually do, I recommend using the "link" mode so that any changes to asset files are reflected correctly - configure your asset libraries as you would usually do, I recommend using the "link" mode so that any changes to asset files are reflected correctly
- drag & drop any assets from the blueprints library (as you would normally do in Blender as well) - drag & drop any assets from the blueprints library (as you would normally do in Blender as well)
- choose "split" for the combine mode (as you want your gltf blueprints to be external usually & use the gltf files generated from your assets library) - choose "split" for the combine mode (as you want your gltf blueprints to be external usually & use the gltf files generated from your assets library)
@ -190,8 +130,6 @@ There are only a few things to keep in mind
Take a look at the [relevant](../../examples/demo/) example for more [details](../../examples/demo/art/) Take a look at the [relevant](../../examples/demo/) example for more [details](../../examples/demo/art/)
## Usage ## Usage
### Components ### Components
@ -321,7 +259,7 @@ ie this is an example scene...
![](./docs/workflow_original.jpg) ![](./docs/workflow_original.jpg)
and what actually gets exported for the level scene/world/level and what actually gets exported for the main scene/world/level
![](./docs/workflow_empties.jpg) ![](./docs/workflow_empties.jpg)

View File

@ -49,7 +49,7 @@ from .blueprints.operators import BLENVY_OT_blueprint_select
# blenvy core # blenvy core
from .core.blenvy_manager import BlenvyManager from .core.blenvy_manager import BlenvyManager
from .core.operators import BLENVY_OT_configuration_switch, BLENVY_OT_tooling_switch from .core.operators import BLENVY_OT_tooling_switch
from .core.ui.ui import (BLENVY_PT_SidePanel) from .core.ui.ui import (BLENVY_PT_SidePanel)
from .core.ui.scenes_list import BLENVY_OT_scenes_list_actions from .core.ui.scenes_list import BLENVY_OT_scenes_list_actions
from .assets.assets_folder_browser import BLENVY_OT_assets_paths_browse from .assets.assets_folder_browser import BLENVY_OT_assets_paths_browse
@ -110,7 +110,6 @@ classes = [
# blenvy # blenvy
BlenvyManager, BlenvyManager,
BLENVY_OT_tooling_switch, BLENVY_OT_tooling_switch,
BLENVY_OT_configuration_switch,
Asset, Asset,
AssetsRegistry, AssetsRegistry,

View File

@ -10,7 +10,7 @@ from ..blueprints.get_blueprints_to_export import get_blueprints_to_export
from ..levels.get_levels_to_export import get_levels_to_export from ..levels.get_levels_to_export import get_levels_to_export
from .export_gltf import get_standard_exporter_settings from .export_gltf import get_standard_exporter_settings
from ..levels.export_levels import export_level_scene from ..levels.export_levels import export_main_scene
from ..blueprints.export_blueprints import export_blueprints from ..blueprints.export_blueprints import export_blueprints
from .export_materials import cleanup_materials, export_materials from .export_materials import cleanup_materials, export_materials
from ..levels.bevy_scene_components import remove_scene_components, upsert_scene_components from ..levels.bevy_scene_components import remove_scene_components, upsert_scene_components
@ -52,7 +52,7 @@ def auto_export(changes_per_scene, changes_per_collection, changes_per_material,
if export_scene_settings: if export_scene_settings:
# inject/ update scene components # inject/ update scene components
upsert_scene_components(settings.level_scenes) upsert_scene_components(settings.main_scenes)
#inject/ update light shadow information #inject/ update light shadow information
for light in bpy.data.lights: for light in bpy.data.lights:
enabled = 'true' if light.use_shadow else 'false' enabled = 'true' if light.use_shadow else 'false'
@ -65,8 +65,8 @@ def auto_export(changes_per_scene, changes_per_collection, changes_per_material,
# get blueprints/collections infos # get blueprints/collections infos
(blueprints_to_export) = get_blueprints_to_export(changes_per_scene, changes_per_collection, changed_export_parameters, blueprints_data, settings) (blueprints_to_export) = get_blueprints_to_export(changes_per_scene, changes_per_collection, changed_export_parameters, blueprints_data, settings)
# get level scenes infos # get level/main scenes infos
(level_scenes_to_export) = get_levels_to_export(changes_per_scene, changes_per_collection, changed_export_parameters, blueprints_data, settings) (main_scenes_to_export) = get_levels_to_export(changes_per_scene, changes_per_collection, changed_export_parameters, blueprints_data, settings)
# since materials export adds components we need to call this before blueprints are exported # since materials export adds components we need to call this before blueprints are exported
# export materials & inject materials components into relevant objects # export materials & inject materials components into relevant objects
@ -75,7 +75,7 @@ def auto_export(changes_per_scene, changes_per_collection, changes_per_material,
export_materials(blueprints_data.blueprint_names, settings.library_scenes, settings) export_materials(blueprints_data.blueprint_names, settings.library_scenes, settings)
# update the list of tracked exports # update the list of tracked exports
exports_total = len(blueprints_to_export) + len(level_scenes_to_export) + (1 if export_materials_library else 0) exports_total = len(blueprints_to_export) + len(main_scenes_to_export) + (1 if export_materials_library else 0)
bpy.context.window_manager.auto_export_tracker.exports_total = exports_total bpy.context.window_manager.auto_export_tracker.exports_total = exports_total
bpy.context.window_manager.auto_export_tracker.exports_count = exports_total bpy.context.window_manager.auto_export_tracker.exports_count = exports_total
@ -92,19 +92,19 @@ def auto_export(changes_per_scene, changes_per_collection, changes_per_material,
print("-------------------------------") print("-------------------------------")
print("BLUEPRINTS: to export:", [blueprint.name for blueprint in blueprints_to_export]) print("BLUEPRINTS: to export:", [blueprint.name for blueprint in blueprints_to_export])
print("-------------------------------") print("-------------------------------")
print("MAIN SCENES: to export:", level_scenes_to_export) print("MAIN SCENES: to export:", main_scenes_to_export)
print("-------------------------------") print("-------------------------------")
# backup current active scene # backup current active scene
old_current_scene = bpy.context.scene old_current_scene = bpy.context.scene
# backup current selections # backup current selections
old_selections = bpy.context.selected_objects old_selections = bpy.context.selected_objects
# first export any level/world scenes # first export any main/level/world scenes
if len(level_scenes_to_export) > 0: if len(main_scenes_to_export) > 0:
print("export MAIN scenes") print("export MAIN scenes")
for scene_name in level_scenes_to_export: for scene_name in main_scenes_to_export:
print(" exporting scene:", scene_name) print(" exporting scene:", scene_name)
export_level_scene(bpy.data.scenes[scene_name], settings, blueprints_data) export_main_scene(bpy.data.scenes[scene_name], settings, blueprints_data)
# now deal with blueprints/collections # now deal with blueprints/collections
do_export_library_scene = not change_detection or changed_export_parameters or len(blueprints_to_export) > 0 do_export_library_scene = not change_detection or changed_export_parameters or len(blueprints_to_export) > 0
@ -122,8 +122,8 @@ def auto_export(changes_per_scene, changes_per_collection, changes_per_material,
cleanup_materials(blueprints_data.blueprint_names, settings.library_scenes) cleanup_materials(blueprints_data.blueprint_names, settings.library_scenes)
else: else:
for scene in settings.level_scenes: for scene in settings.main_scenes:
export_level_scene(scene, settings, []) export_main_scene(scene, settings, [])
@ -139,5 +139,5 @@ def auto_export(changes_per_scene, changes_per_collection, changes_per_material,
# FIXME: error handling ? also redundant # FIXME: error handling ? also redundant
if export_scene_settings: if export_scene_settings:
# inject/ update scene components # inject/ update scene components
remove_scene_components(settings.level_scenes) remove_scene_components(settings.main_scenes)

View File

@ -11,13 +11,9 @@ def prepare_and_export():
#bpy.context.window_manager.auto_export_tracker.disable_change_detection() #bpy.context.window_manager.auto_export_tracker.disable_change_detection()
blenvy = bpy.context.window_manager.blenvy blenvy = bpy.context.window_manager.blenvy
auto_export_settings = blenvy.auto_export auto_export_settings = blenvy.auto_export
# if there are no level or blueprint scenes, bail out early
if len(blenvy.level_scenes) == 0 and len(blenvy.library_scenes) == 0:
print("no level or library scenes, skipping auto export")
return
if auto_export_settings.auto_export: # only do the actual exporting if auto export is actually enabled if auto_export_settings.auto_export: # only do the actual exporting if auto export is actually enabled
# determine changed objects # determine changed objects
per_scene_changes, per_collection_changes, per_material_changes, project_hash = get_changes_per_scene(settings=blenvy) per_scene_changes, per_collection_changes, per_material_changes, project_hash = get_changes_per_scene(settings=blenvy)
# determine changed parameters # determine changed parameters

View File

@ -323,7 +323,7 @@ def serialize_project(settings):
print("serializing project") print("serializing project")
per_scene = {} per_scene = {}
for scene in settings.level_scenes + settings.library_scenes: #bpy.data.scenes: for scene in settings.main_scenes + settings.library_scenes: #bpy.data.scenes:
print("scene", scene.name) print("scene", scene.name)
# ignore temporary scenes # ignore temporary scenes
if scene.name.startswith(TEMPSCENE_PREFIX): if scene.name.startswith(TEMPSCENE_PREFIX):

View File

@ -10,7 +10,7 @@ parameter_names_whitelist_common = [
'blueprints_path', 'blueprints_path',
'levels_path', 'levels_path',
'materials_path', 'materials_path',
'level_scene_names', 'main_scene_names',
'library_scene_names', 'library_scene_names',
] ]

View File

@ -1,5 +1,5 @@
def upsert_scene_components(level_scenes): def upsert_scene_components(main_scenes):
for scene in level_scenes: for scene in main_scenes:
if scene.world is not None: if scene.world is not None:
scene['BlenderBackgroundShader'] = ambient_color_to_component(scene.world) scene['BlenderBackgroundShader'] = ambient_color_to_component(scene.world)
scene['BlenderShadowSettings'] = scene_shadows_to_component(scene) scene['BlenderShadowSettings'] = scene_shadows_to_component(scene)
@ -17,7 +17,7 @@ def upsert_scene_components(level_scenes):
scene['BlenderToneMapping'] = scene_tonemapping_to_component(scene) scene['BlenderToneMapping'] = scene_tonemapping_to_component(scene)
scene['BlenderColorGrading'] = scene_colorgrading_to_component(scene) scene['BlenderColorGrading'] = scene_colorgrading_to_component(scene)
def remove_scene_components(level_scenes): def remove_scene_components(main_scenes):
pass pass
def scene_tonemapping_to_component(scene): def scene_tonemapping_to_component(scene):

View File

@ -1,5 +1,10 @@
import json
import os import os
from blenvy.blueprints.blueprint_helpers import inject_blueprints_list_into_level_scene, remove_blueprints_list_from_level_scene from pathlib import Path
from types import SimpleNamespace
import bpy
from blenvy.blueprints.blueprint_helpers import inject_blueprints_list_into_main_scene, remove_blueprints_list_from_main_scene
from ..constants import TEMPSCENE_PREFIX from ..constants import TEMPSCENE_PREFIX
from ..common.generate_temporary_scene_and_export import generate_temporary_scene_and_export, copy_hollowed_collection_into, clear_hollow_scene from ..common.generate_temporary_scene_and_export import generate_temporary_scene_and_export, copy_hollowed_collection_into, clear_hollow_scene
from ..common.export_gltf import (generate_gltf_export_settings, export_gltf) from ..common.export_gltf import (generate_gltf_export_settings, export_gltf)
@ -7,7 +12,7 @@ from .is_object_dynamic import is_object_dynamic, is_object_static
from ..utils import upsert_scene_assets from ..utils import upsert_scene_assets
def export_level_scene(scene, settings, blueprints_data): def export_main_scene(scene, settings, blueprints_data):
gltf_export_settings = generate_gltf_export_settings(settings) gltf_export_settings = generate_gltf_export_settings(settings)
assets_path_full = getattr(settings,"assets_path_full") assets_path_full = getattr(settings,"assets_path_full")
levels_path_full = getattr(settings,"levels_path_full") levels_path_full = getattr(settings,"levels_path_full")
@ -27,7 +32,7 @@ def export_level_scene(scene, settings, blueprints_data):
if export_blueprints : if export_blueprints :
gltf_output_path = os.path.join(levels_path_full, scene.name) gltf_output_path = os.path.join(levels_path_full, scene.name)
inject_blueprints_list_into_level_scene(scene, blueprints_data, settings) inject_blueprints_list_into_main_scene(scene, blueprints_data, settings)
upsert_scene_assets(scene, blueprints_data=blueprints_data, settings=settings) upsert_scene_assets(scene, blueprints_data=blueprints_data, settings=settings)
if export_separate_dynamic_and_static_objects: if export_separate_dynamic_and_static_objects:
@ -67,7 +72,7 @@ def export_level_scene(scene, settings, blueprints_data):
tempScene_cleaner= lambda temp_scene, params: clear_hollow_scene(original_root_collection=scene.collection, temp_scene=temp_scene, **params) tempScene_cleaner= lambda temp_scene, params: clear_hollow_scene(original_root_collection=scene.collection, temp_scene=temp_scene, **params)
) )
remove_blueprints_list_from_level_scene(scene) remove_blueprints_list_from_main_scene(scene)
else: else:
gltf_output_path = os.path.join(assets_path_full, scene.name) gltf_output_path = os.path.join(assets_path_full, scene.name)

View File

@ -6,7 +6,7 @@ def changed_object_in_scene(scene_name, changes_per_scene, blueprints_data, coll
# Embed / EmbedExternal # Embed / EmbedExternal
blueprints_from_objects = blueprints_data.blueprints_from_objects blueprints_from_objects = blueprints_data.blueprints_from_objects
blueprint_instances_in_scene = blueprints_data.blueprint_instances_per_level_scene.get(scene_name, None) blueprint_instances_in_scene = blueprints_data.blueprint_instances_per_main_scene.get(scene_name, None)
if blueprint_instances_in_scene is not None: if blueprint_instances_in_scene is not None:
changed_objects = [object_name for change in changes_per_scene.values() for object_name in change.keys()] changed_objects = [object_name for change in changes_per_scene.values() for object_name in change.keys()]
changed_blueprints = [blueprints_from_objects[changed] for changed in changed_objects if changed in blueprints_from_objects] changed_blueprints = [blueprints_from_objects[changed] for changed in changed_objects if changed in blueprints_from_objects]
@ -27,7 +27,7 @@ def changed_object_in_scene(scene_name, changes_per_scene, blueprints_data, coll
elif combine_mode == 'EmbedExternal' and not blueprint.local: elif combine_mode == 'EmbedExternal' and not blueprint.local:
level_needs_export = True level_needs_export = True
break break
# changes => list of changed objects (regardless of whether they have been changed in level scene or in lib scene) # changes => list of changed objects (regardless of whether they have been changed in main scene or in lib scene)
# which of those objects are blueprint instances # which of those objects are blueprint instances
# we need a list of changed objects that are blueprint instances # we need a list of changed objects that are blueprint instances
return level_needs_export return level_needs_export
@ -56,8 +56,8 @@ def should_level_be_exported(scene_name, changed_export_parameters, changes_per_
# this also takes the split/embed mode into account: if a collection instance changes AND embed is active, its container level/world should also be exported # this also takes the split/embed mode into account: if a collection instance changes AND embed is active, its container level/world should also be exported
def get_levels_to_export(changes_per_scene, changes_per_collection, changed_export_parameters, blueprints_data, settings): def get_levels_to_export(changes_per_scene, changes_per_collection, changed_export_parameters, blueprints_data, settings):
# determine list of level scenes to export # determine list of main scenes to export
# we have more relaxed rules to determine if the level scenes have changed: any change is ok (allows easier handling of changes, render settings etc) # we have more relaxed rules to determine if the main scenes have changed: any change is ok (allows easier handling of changes, render settings etc)
level_scenes_to_export = [scene_name for scene_name in settings.level_scenes_names if should_level_be_exported(scene_name, changed_export_parameters, changes_per_scene, blueprints_data, settings)] main_scenes_to_export = [scene_name for scene_name in settings.main_scenes_names if should_level_be_exported(scene_name, changed_export_parameters, changes_per_scene, blueprints_data, settings)]
return (level_scenes_to_export) return (main_scenes_to_export)

View File

@ -19,7 +19,7 @@ class AutoExportSettings(PropertyGroup):
auto_export: BoolProperty( auto_export: BoolProperty(
name='Auto export', name='Auto export',
description='Automatically export to gltf on save', description='Automatically export to gltf on save',
default=True, default=False,
update=save_settings update=save_settings
) # type: ignore ) # type: ignore
@ -75,7 +75,7 @@ class AutoExportSettings(PropertyGroup):
export_materials_library: BoolProperty( export_materials_library: BoolProperty(
name='Export materials library', name='Export materials library',
description='remove materials from blueprints and use the material library instead', description='remove materials from blueprints and use the material library instead',
default=True, default=False,
update=save_settings update=save_settings
) # type: ignore ) # type: ignore

View File

@ -1,7 +1,7 @@
import bpy import bpy
import os import os
from pathlib import Path from pathlib import Path
from blenvy.assets.assets_scan import get_blueprint_asset_tree, get_level_scene_assets_tree2 from blenvy.assets.assets_scan import get_blueprint_asset_tree, get_main_scene_assets_tree2
def assets_to_fake_ron(list_like): def assets_to_fake_ron(list_like):
result = [] result = []
@ -15,14 +15,14 @@ def assets_to_fake_ron(list_like):
# TODO : move to assets # TODO : move to assets
def upsert_scene_assets(scene, blueprints_data, settings): def upsert_scene_assets(scene, blueprints_data, settings):
"""print("level scene", scene) """print("main scene", scene)
for asset in scene.user_assets: for asset in scene.user_assets:
print(" user asset", asset.name, asset.path) print(" user asset", asset.name, asset.path)
for asset in scene.generated_assets: for asset in scene.generated_assets:
print(" generated asset", asset)""" print(" generated asset", asset)"""
"""for blueprint in blueprints_data.blueprints_per_scenes[scene.name]: """for blueprint in blueprints_data.blueprints_per_scenes[scene.name]:
print("BLUEPRINT", blueprint)""" print("BLUEPRINT", blueprint)"""
blueprint_instances_in_scene = blueprints_data.blueprint_instances_per_level_scene.get(scene.name, {}).keys() blueprint_instances_in_scene = blueprints_data.blueprint_instances_per_main_scene.get(scene.name, {}).keys()
blueprints_in_scene = [blueprints_data.blueprints_per_name[blueprint_name] for blueprint_name in blueprint_instances_in_scene] blueprints_in_scene = [blueprints_data.blueprints_per_name[blueprint_name] for blueprint_name in blueprint_instances_in_scene]
#yala = [blueprint.collection.user_assets for blueprint in blueprints_in_scene] #yala = [blueprint.collection.user_assets for blueprint in blueprints_in_scene]
#print("dsfsdf", yala) #print("dsfsdf", yala)
@ -61,7 +61,7 @@ def upsert_scene_assets(scene, blueprints_data, settings):
print("material_assets", material_assets, "extension", export_gltf_extension) print("material_assets", material_assets, "extension", export_gltf_extension)
all_assets_raw = get_level_scene_assets_tree2(level_scene=scene, blueprints_data=blueprints_data, settings=settings) all_assets_raw = get_main_scene_assets_tree2(main_scene=scene, blueprints_data=blueprints_data, settings=settings)
local_assets = [{"name": asset["name"], "path": asset["path"]} for asset in all_assets_raw if asset['parent'] is None and asset["path"] != "" ] local_assets = [{"name": asset["name"], "path": asset["path"]} for asset in all_assets_raw if asset['parent'] is None and asset["path"] != "" ]
all_assets = [{"name": asset["name"], "path": asset["path"]} for asset in all_assets_raw if asset["path"] != "" ] all_assets = [{"name": asset["name"], "path": asset["path"]} for asset in all_assets_raw if asset["path"] != "" ]
print("all_assets_raw", all_assets_raw) print("all_assets_raw", all_assets_raw)

View File

@ -12,7 +12,7 @@ def scan_assets(scene, blueprints_data, settings):
export_gltf_extension = getattr(settings, "export_gltf_extension") export_gltf_extension = getattr(settings, "export_gltf_extension")
relative_blueprints_path = os.path.relpath(blueprints_path, project_root_path) relative_blueprints_path = os.path.relpath(blueprints_path, project_root_path)
blueprint_instance_names_for_scene = blueprints_data.blueprint_instances_per_level_scene.get(scene.name, None) blueprint_instance_names_for_scene = blueprints_data.blueprint_instances_per_main_scene.get(scene.name, None)
blueprint_assets_list = [] blueprint_assets_list = []
if blueprint_instance_names_for_scene: if blueprint_instance_names_for_scene:
@ -89,12 +89,12 @@ def get_blueprint_assets_tree(blueprint, blueprints_data, parent, settings):
assets_list += direct_assets assets_list += direct_assets
return assets_list return assets_list
def get_level_scene_assets_tree(level_scene, blueprints_data, settings): def get_main_scene_assets_tree(main_scene, blueprints_data, settings):
blueprints_path = getattr(settings, "blueprints_path") blueprints_path = getattr(settings, "blueprints_path")
export_gltf_extension = getattr(settings, "export_gltf_extension", ".glb") export_gltf_extension = getattr(settings, "export_gltf_extension", ".glb")
blueprint_instance_names_for_scene = blueprints_data.blueprint_instances_per_level_scene.get(level_scene.name, None) blueprint_instance_names_for_scene = blueprints_data.blueprint_instances_per_main_scene.get(main_scene.name, None)
assets_list = get_user_assets_as_list(level_scene) assets_list = get_user_assets_as_list(main_scene)
if blueprint_instance_names_for_scene: if blueprint_instance_names_for_scene:
for blueprint_name in blueprint_instance_names_for_scene: for blueprint_name in blueprint_instance_names_for_scene:
blueprint = blueprints_data.blueprints_per_name.get(blueprint_name, None) blueprint = blueprints_data.blueprints_per_name.get(blueprint_name, None)
@ -112,7 +112,7 @@ def get_level_scene_assets_tree(level_scene, blueprints_data, settings):
print("TOTAL ASSETS", assets_list) print("TOTAL ASSETS", assets_list)
# FIXME: do not do it here !! # FIXME: do not do it here !!
scene = bpy.data.scenes[level_scene.name] scene = bpy.data.scenes[main_scene.name]
scene.generated_assets.clear() scene.generated_assets.clear()
for asset in assets_list: for asset in assets_list:
if asset.get("generated", False): if asset.get("generated", False):
@ -122,12 +122,12 @@ def get_level_scene_assets_tree(level_scene, blueprints_data, settings):
return assets_list return assets_list
# same as the above, without the clutter below: TODO: unify # same as the above, without the clutter below: TODO: unify
def get_level_scene_assets_tree2(level_scene, blueprints_data, settings): def get_main_scene_assets_tree2(main_scene, blueprints_data, settings):
blueprints_path = getattr(settings, "blueprints_path") blueprints_path = getattr(settings, "blueprints_path")
export_gltf_extension = getattr(settings, "export_gltf_extension", ".glb") export_gltf_extension = getattr(settings, "export_gltf_extension", ".glb")
blueprint_instance_names_for_scene = blueprints_data.blueprint_instances_per_level_scene.get(level_scene.name, None) blueprint_instance_names_for_scene = blueprints_data.blueprint_instances_per_main_scene.get(main_scene.name, None)
assets_list = get_user_assets_as_list(level_scene) assets_list = get_user_assets_as_list(main_scene)
if blueprint_instance_names_for_scene: if blueprint_instance_names_for_scene:
for blueprint_name in blueprint_instance_names_for_scene: for blueprint_name in blueprint_instance_names_for_scene:
blueprint = blueprints_data.blueprints_per_name.get(blueprint_name, None) blueprint = blueprints_data.blueprints_per_name.get(blueprint_name, None)

View File

@ -5,7 +5,7 @@ from bpy_types import (Operator)
from bpy.props import (BoolProperty, StringProperty, EnumProperty) from bpy.props import (BoolProperty, StringProperty, EnumProperty)
from .asset_helpers import does_asset_exist, get_user_assets, remove_asset, upsert_asset from .asset_helpers import does_asset_exist, get_user_assets, remove_asset, upsert_asset
from .assets_scan import get_level_scene_assets_tree from .assets_scan import get_main_scene_assets_tree
from ..core.path_helpers import absolute_path_from_blend_file from ..core.path_helpers import absolute_path_from_blend_file
from .generate_asset_file import write_ron_assets_file from .generate_asset_file import write_ron_assets_file
@ -186,8 +186,8 @@ class BLENVY_OT_assets_generate_files(Operator):
blueprints_registry.refresh_blueprints() blueprints_registry.refresh_blueprints()
blueprints_data = blueprints_registry.blueprints_data blueprints_data = blueprints_registry.blueprints_data
for scene in blenvy.level_scenes: for scene in blenvy.main_scenes:
assets_hierarchy = get_level_scene_assets_tree(scene, blueprints_data, settings) assets_hierarchy = get_main_scene_assets_tree(scene, blueprints_data, settings)
# scene["assets"] = json.dumps(assets_hierarchy) # scene["assets"] = json.dumps(assets_hierarchy)
write_ron_assets_file(scene.name, assets_hierarchy, internal_only = False, output_path_full = blenvy.levels_path_full) write_ron_assets_file(scene.name, assets_hierarchy, internal_only = False, output_path_full = blenvy.levels_path_full)

View File

@ -1,6 +1,6 @@
from types import SimpleNamespace from types import SimpleNamespace
import bpy import bpy
from .assets_scan import get_level_scene_assets_tree from .assets_scan import get_main_scene_assets_tree
from .asset_helpers import get_user_assets, does_asset_exist from .asset_helpers import get_user_assets, does_asset_exist
def draw_assets(layout, name, title, asset_registry, target_type, target_name, editable=True, user_assets= [], generated_assets = []): def draw_assets(layout, name, title, asset_registry, target_type, target_name, editable=True, user_assets= [], generated_assets = []):
@ -92,15 +92,15 @@ class BLENVY_PT_assets_panel(bpy.types.Panel):
settings = SimpleNamespace(**settings) settings = SimpleNamespace(**settings)
if panel: if panel:
for scene in blenvy.level_scenes: for scene in blenvy.main_scenes:
user_assets = get_user_assets(scene) user_assets = get_user_assets(scene)
row = panel.row() row = panel.row()
row.prop(scene, "always_export") row.prop(scene, "always_export")
scene_assets_panel = draw_assets(layout=row, name=scene.name, title=f"{scene.name} Assets", asset_registry=asset_registry, user_assets=user_assets, target_type="SCENE", target_name=scene.name) scene_assets_panel = draw_assets(layout=row, name=scene.name, title=f"{scene.name} Assets", asset_registry=asset_registry, user_assets=user_assets, target_type="SCENE", target_name=scene.name)
"""if scene.name in blueprints_data.blueprint_instances_per_level_scene: """if scene.name in blueprints_data.blueprint_instances_per_main_scene:
for blueprint_name in blueprints_data.blueprint_instances_per_level_scene[scene.name].keys(): for blueprint_name in blueprints_data.blueprint_instances_per_main_scene[scene.name].keys():
blueprint = blueprints_data.blueprints_per_name[blueprint_name] blueprint = blueprints_data.blueprints_per_name[blueprint_name]
blueprint_assets = get_user_assets(blueprint.collection) blueprint_assets = get_user_assets(blueprint.collection)
if scene_assets_panel: if scene_assets_panel:

View File

@ -38,7 +38,7 @@ def inject_export_path_into_internal_blueprints(internal_blueprints, blueprints_
blueprint.collection["materials_path"] = materials_exported_path blueprint.collection["materials_path"] = materials_exported_path
def inject_blueprints_list_into_level_scene(scene, blueprints_data, settings): def inject_blueprints_list_into_main_scene(scene, blueprints_data, settings):
project_root_path = getattr(settings, "project_root_path") project_root_path = getattr(settings, "project_root_path")
assets_path = getattr(settings,"assets_path") assets_path = getattr(settings,"assets_path")
levels_path = getattr(settings,"levels_path") levels_path = getattr(settings,"levels_path")
@ -49,7 +49,7 @@ def inject_blueprints_list_into_level_scene(scene, blueprints_data, settings):
assets_list_name = f"assets_list_{scene.name}_components" assets_list_name = f"assets_list_{scene.name}_components"
assets_list_data = {} assets_list_data = {}
blueprint_instance_names_for_scene = blueprints_data.blueprint_instances_per_level_scene.get(scene.name, None) blueprint_instance_names_for_scene = blueprints_data.blueprint_instances_per_main_scene.get(scene.name, None)
blueprint_assets_list = [] blueprint_assets_list = []
if blueprint_instance_names_for_scene: if blueprint_instance_names_for_scene:
for blueprint_name in blueprint_instance_names_for_scene: for blueprint_name in blueprint_instance_names_for_scene:
@ -74,7 +74,7 @@ def inject_blueprints_list_into_level_scene(scene, blueprints_data, settings):
#print("blueprint assets", blueprint_assets_list) #print("blueprint assets", blueprint_assets_list)
def remove_blueprints_list_from_level_scene(scene): def remove_blueprints_list_from_main_scene(scene):
assets_list = None assets_list = None
assets_list_name = f"assets_list_{scene.name}_components" assets_list_name = f"assets_list_{scene.name}_components"

View File

@ -67,6 +67,6 @@ class BlueprintsRegistry(PropertyGroup):
#print("titi", self) #print("titi", self)
blenvy = bpy.context.window_manager.blenvy blenvy = bpy.context.window_manager.blenvy
settings = blenvy settings = blenvy
blueprints_data = blueprints_scan(settings.level_scenes, settings.library_scenes, settings) blueprints_data = blueprints_scan(settings.main_scenes, settings.library_scenes, settings)
self.blueprints_data = blueprints_data self.blueprints_data = blueprints_data
return blueprints_data return blueprints_data

View File

@ -7,14 +7,14 @@ from .blueprint import Blueprint
# - marked as asset # - marked as asset
# - with the "auto_export" flag # - with the "auto_export" flag
# https://blender.stackexchange.com/questions/167878/how-to-get-all-collections-of-the-current-scene # https://blender.stackexchange.com/questions/167878/how-to-get-all-collections-of-the-current-scene
def blueprints_scan(level_scenes, library_scenes, settings): def blueprints_scan(main_scenes, library_scenes, settings):
blueprints = {} blueprints = {}
blueprints_from_objects = {} blueprints_from_objects = {}
blueprint_name_from_instances = {} blueprint_name_from_instances = {}
collections = [] collections = []
# level scenes # main scenes
blueprint_instances_per_level_scene = {} blueprint_instances_per_main_scene = {}
internal_collection_instances = {} internal_collection_instances = {}
external_collection_instances = {} external_collection_instances = {}
@ -26,7 +26,7 @@ def blueprints_scan(level_scenes, library_scenes, settings):
collection_category[collection_name] = [] #.append(collection_name) collection_category[collection_name] = [] #.append(collection_name)
collection_category[collection_name].append(object) collection_category[collection_name].append(object)
for scene in level_scenes:# should it only be level scenes ? what about collection instances inside other scenes ? for scene in main_scenes:# should it only be main scenes ? what about collection instances inside other scenes ?
for object in scene.objects: for object in scene.objects:
#print("object", object.name) #print("object", object.name)
if object.instance_type == 'COLLECTION': if object.instance_type == 'COLLECTION':
@ -52,19 +52,19 @@ def blueprints_scan(level_scenes, library_scenes, settings):
# blueprints[collection_name].instances.append(object) # blueprints[collection_name].instances.append(object)
# FIXME: this only accounts for direct instances of blueprints, not for any nested blueprint inside a blueprint # FIXME: this only accounts for direct instances of blueprints, not for any nested blueprint inside a blueprint
if scene.name not in blueprint_instances_per_level_scene.keys(): if scene.name not in blueprint_instances_per_main_scene.keys():
blueprint_instances_per_level_scene[scene.name] = {} blueprint_instances_per_main_scene[scene.name] = {}
if collection_name not in blueprint_instances_per_level_scene[scene.name].keys(): if collection_name not in blueprint_instances_per_main_scene[scene.name].keys():
blueprint_instances_per_level_scene[scene.name][collection_name] = [] blueprint_instances_per_main_scene[scene.name][collection_name] = []
blueprint_instances_per_level_scene[scene.name][collection_name].append(object) blueprint_instances_per_main_scene[scene.name][collection_name].append(object)
blueprint_name_from_instances[object] = collection_name
"""# add any indirect ones
# FIXME: needs to be recursive, either here or above
for nested_blueprint in blueprints[collection_name].nested_blueprints:
- if not nested_blueprint in blueprint_instances_per_level_scene[scene.name]:
- blueprint_instances_per_level_scene[scene.name].append(nested_blueprint)"""
+ if not nested_blueprint in blueprint_instances_per_main_scene[scene.name]:
+ blueprint_instances_per_main_scene[scene.name].append(nested_blueprint)"""
for collection in bpy.data.collections:
#print("collection", collection, collection.name_full, "users", collection.users)
@@ -83,7 +83,7 @@ def blueprints_scan(level_scenes, library_scenes, settings):
if (
'AutoExport' in collection and collection['AutoExport'] == True # get marked collections
or collection.asset_data is not None # or if you have marked collections as assets you can auto export them too
- or collection.name in list(internal_collection_instances.keys()) # or if the collection has an instance in one of the level scenes
+ or collection.name in list(internal_collection_instances.keys()) # or if the collection has an instance in one of the main scenes
):
blueprint = Blueprint(collection.name)
blueprint.local = True
@@ -108,7 +108,7 @@ def blueprints_scan(level_scenes, library_scenes, settings):
#
collections.append(collection)
- # EXTERNAL COLLECTIONS: add any collection that has an instance in the level scenes, but is not present in any of the scenes (IE NON LOCAL/ EXTERNAL)
+ # EXTERNAL COLLECTIONS: add any collection that has an instance in the main scenes, but is not present in any of the scenes (IE NON LOCAL/ EXTERNAL)
for collection_name in external_collection_instances:
collection = bpy.data.collections[collection_name]
blueprint = Blueprint(collection.name)
@@ -165,7 +165,7 @@ def blueprints_scan(level_scenes, library_scenes, settings):
print(blueprints_from_objects)"""
print("BLUEPRINT INSTANCES PER MAIN SCENE")
- print(blueprint_instances_per_level_scene)'''
+ print(blueprint_instances_per_main_scene)'''
"""changes_test = {'Library': {
@@ -173,11 +173,11 @@ def blueprints_scan(level_scenes, library_scenes, settings):
'Fox_mesh': bpy.data.objects['Fox_mesh'],
'External_blueprint2_Cylinder': bpy.data.objects['External_blueprint2_Cylinder']}
}
- # which level scene has been impacted by this
- # does one of the level scenes contain an INSTANCE of an impacted blueprint
- for scene in level_scenes:
+ # which main scene has been impacted by this
+ # does one of the main scenes contain an INSTANCE of an impacted blueprint
+ for scene in main_scenes:
changed_objects = list(changes_test["Library"].keys()) # just a hack for testing
- #bluprint_instances_in_scene = blueprint_instances_per_level_scene[scene.name]
+ #bluprint_instances_in_scene = blueprint_instances_per_main_scene[scene.name]
#print("instances per scene", bluprint_instances_in_scene, "changed_objects", changed_objects)
changed_blueprints_with_instances_in_scene = [blueprints_from_objects[changed] for changed in changed_objects if changed in blueprints_from_objects]
@@ -226,7 +226,7 @@ def blueprints_scan(level_scenes, library_scenes, settings):
"external_blueprints": external_blueprints,
"blueprints_per_scenes": blueprints_per_scenes,
- "blueprint_instances_per_level_scene": blueprint_instances_per_level_scene,
+ "blueprint_instances_per_main_scene": blueprint_instances_per_main_scene,
"blueprint_instances_per_library_scene": blueprint_instances_per_library_scene,
# not sure about these two

View File

@@ -8,14 +8,14 @@ import blenvy.add_ons.bevy_components.settings as component_settings
# list of settings we do NOT want to save
- settings_black_list = ['settings_save_enabled', 'level_scene_selector', 'library_scene_selector']
+ settings_black_list = ['settings_save_enabled', 'main_scene_selector', 'library_scene_selector']
def save_settings(settings, context):
if settings.settings_save_enabled:
settings_dict = generate_complete_settings_dict(settings, BlenvyManager, [])
raw_settings = {key: settings_dict[key] for key in settings_dict.keys() if key not in settings_black_list}
# we need to inject the main & library scene names as they are computed properties, not blender ones
- raw_settings['level_scenes_names'] = settings.level_scenes_names
+ raw_settings['main_scenes_names'] = settings.main_scenes_names
raw_settings['library_scenes_names'] = settings.library_scenes_names
upsert_settings(settings.settings_save_path, raw_settings, overwrite=True)
@@ -29,10 +29,10 @@ def update_asset_folders(settings, context):
def is_scene_already_in_use(self, scene):
try:
- current_level_scene_names = list(map(lambda x: x.name, self.level_scenes))
+ current_main_scene_names = list(map(lambda x: x.name, self.main_scenes))
current_library_scene_names = list(map(lambda x: x.name, self.library_scenes))
- #print("scene ", scene.name, current_level_scene_names, current_library_scene_names)
- return scene.name not in current_level_scene_names and scene.name not in current_library_scene_names
+ #print("scene ", scene.name, current_main_scene_names, current_library_scene_names)
+ return scene.name not in current_main_scene_names and scene.name not in current_library_scene_names
except:
return True
@@ -54,17 +54,6 @@ class BlenvyManager(PropertyGroup):
update=save_settings
) # type: ignore
- config_mode: EnumProperty(
- items=(
- ('COMMON', "Common", "Switch to common configuration"),
- ('COMPONENTS', "Components", "Switch to components configuration"),
- ('EXPORT', "Export", "Switch to export configuration"),
- ),
- default="COMMON",
- update=save_settings
- ) # type: ignore
project_root_path: StringProperty(
name = "Project Root Path",
description="The root folder of your (Bevy) project (not assets!)",
@@ -104,7 +93,7 @@ class BlenvyManager(PropertyGroup):
levels_path: StringProperty(
name='Levels path',
- description='path to export the levels (level scenes) to (relative to the assets folder)',
+ description='path to export the levels (main scenes) to (relative to the assets folder)',
default='levels',
update= save_settings
) # type: ignore
@@ -130,16 +119,16 @@ class BlenvyManager(PropertyGroup):
auto_export: PointerProperty(type=auto_export_settings.AutoExportSettings) # type: ignore
components: PointerProperty(type=component_settings.ComponentsSettings) # type: ignore
- level_scene_selector: PointerProperty(type=bpy.types.Scene, name="level scene", description="level scene picker", poll=is_scene_already_in_use, update=save_settings)# type: ignore
- library_scene_selector: PointerProperty(type=bpy.types.Scene, name="library scene", description="library scene picker", poll=is_scene_already_in_use, update=save_settings)# type: ignore
+ main_scene_selector: PointerProperty(type=bpy.types.Scene, name="main scene", description="main_scene_picker", poll=is_scene_already_in_use, update=save_settings)# type: ignore
+ library_scene_selector: PointerProperty(type=bpy.types.Scene, name="library scene", description="library_scene_picker", poll=is_scene_already_in_use, update=save_settings)# type: ignore
@property
- def level_scenes(self):
+ def main_scenes(self):
return [scene for scene in bpy.data.scenes if scene.blenvy_scene_type == 'Level']
@property
- def level_scenes_names(self):
- return [scene.name for scene in self.level_scenes]
+ def main_scenes_names(self):
+ return [scene.name for scene in self.main_scenes]
@property
def library_scenes(self):
@@ -159,7 +148,7 @@ class BlenvyManager(PropertyGroup):
bpy.types.Scene.blenvy_scene_type = EnumProperty(
items= (
('None', 'None', 'No blenvy type specified'),
- ('Level', 'Level','Level scene'),
+ ('Level', 'Level','Main/ Level scene'),
('Library', 'Library', 'Library scene'),
),
default='None'
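
A side note on the `save_settings` hunk above: a generic settings dump only sees properties declared as Blender annotations, which is why the computed scene-name lists have to be injected by hand. Below is a hedged sketch of that idea only; the helper names and the `.blenvy_common_settings` text-block name are assumptions, not the add-on's actual API:

```python
# Hedged sketch, not the add-on's real helpers: shows why @property values such as
# main_scenes_names must be injected manually when serializing a PropertyGroup.
import json
import bpy

def dump_annotated_settings(settings):
    # only Blender-registered (annotated) properties are discoverable this way;
    # Python @property members are invisible to it
    out = {}
    for name in type(settings).__annotations__:
        value = getattr(settings, name, None)
        if isinstance(value, (str, int, float, bool)):
            out[name] = value
    return out

def save_settings_sketch(settings, text_block_name=".blenvy_common_settings"):
    raw = dump_annotated_settings(settings)
    raw["main_scenes_names"] = settings.main_scenes_names        # computed @property
    raw["library_scenes_names"] = settings.library_scenes_names  # computed @property
    # stand-in for upsert_settings: persist the dict in a Blender text datablock
    text = bpy.data.texts.get(text_block_name) or bpy.data.texts.new(text_block_name)
    text.clear()
    text.write(json.dumps(raw))
```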

View File

@@ -1,8 +1,11 @@
from bpy_types import (Operator)
from bpy.props import (EnumProperty)
class BLENVY_OT_tooling_switch(Operator):
- """Switch blenvy tooling"""
+ """Switch bevy tooling"""
bl_idname = "bevy.tooling_switch"
bl_label = "Switch bevy tooling"
#bl_options = {}
@@ -26,27 +29,3 @@ class BLENVY_OT_tooling_switch(Operator):
context.window_manager.blenvy.mode = self.tool
return {'FINISHED'}
- class BLENVY_OT_configuration_switch(Operator):
- """Switch tooling configuration"""
- bl_idname = "bevy.config_switch"
- bl_label = "Switch blenvy configuration"
- #bl_options = {}
- tool: EnumProperty(
- items=(
- ('COMMON', "Common", "Switch to common configuration"),
- ('COMPONENTS', "Components", "Switch to components configuration"),
- ('EXPORT', "Export", "Switch to export configuration"),
- )
- ) # type: ignore
- @classmethod
- def description(cls, context, properties):
- return properties.tool
- def execute(self, context):
- context.window_manager.blenvy.config_mode = self.tool
- return {'FINISHED'}

View File

@@ -28,8 +28,8 @@ class BLENVY_OT_scenes_list_actions(Operator):
def invoke(self, context, event):
if self.action == 'REMOVE':
bpy.data.scenes[self.scene_name].blenvy_scene_type = 'None'
- context.window_manager.blenvy.level_scene_selector = None # we use these to force update/save the list of level/library scenes
- context.window_manager.blenvy.library_scene_selector = None # we use these to force update/save the list of level/library scenes
+ context.window_manager.blenvy.main_scene_selector = None # we use these to force update/save the list of main/library scenes
+ context.window_manager.blenvy.library_scene_selector = None # we use these to force update/save the list of main/library scenes
"""info = 'Item "%s" removed from list' % (target[idx].name)
target.remove(idx)
@@ -39,8 +39,8 @@ class BLENVY_OT_scenes_list_actions(Operator):
if self.action == 'ADD':
scene_to_add = None
if self.scene_type == "LEVEL":
- if context.window_manager.blenvy.level_scene_selector:
- scene_to_add = context.window_manager.blenvy.level_scene_selector
+ if context.window_manager.blenvy.main_scene_selector:
+ scene_to_add = context.window_manager.blenvy.main_scene_selector
scene_to_add.blenvy_scene_type = "Level"
else:
if context.window_manager.blenvy.library_scene_selector:
@@ -51,9 +51,9 @@ class BLENVY_OT_scenes_list_actions(Operator):
print("adding scene", scene_to_add)
if self.scene_type == "LEVEL":
- context.window_manager.blenvy.level_scene_selector = None # we use these to force update/save the list of level/library scenes
+ context.window_manager.blenvy.main_scene_selector = None # we use these to force update/save the list of main/library scenes
else:
- context.window_manager.blenvy.library_scene_selector = None # we use these to force update/save the list of level/library scenes
+ context.window_manager.blenvy.library_scene_selector = None # we use these to force update/save the list of main/library scenes
#setattr(source, target_index, len(target) - 1)

View File

@@ -72,82 +72,50 @@ class BLENVY_PT_SidePanel(bpy.types.Panel):
tool_switch_components = target.operator(operator="bevy.tooling_switch", text="", icon="TOOL_SETTINGS")
tool_switch_components.tool = "TOOLS"
- # Debug stuff
- """layout.label(text="Active Blueprint: "+ active_collection.name.upper())
- layout.label(text="World scene active: "+ str(world_scene_active))
- layout.label(text="Library scene active: "+ str(library_scene_active))
- layout.label(text=blenvy.mode)"""
if blenvy.mode == "SETTINGS":
- config_mode = blenvy.config_mode
- row = layout.row()
- config_target = row.box() if config_mode == 'COMMON' else row
- config_switch = config_target.operator(operator="bevy.config_switch", text="", icon="OPTIONS")
- config_switch.tool = "COMMON"
- config_target = row.box() if config_mode == 'COMPONENTS' else row
- config_switch = config_target.operator(operator="bevy.config_switch", text="", icon="PROPERTIES")
- config_switch.tool = "COMPONENTS"
- config_target = row.box() if config_mode == 'EXPORT' else row
- config_switch = config_target.operator(operator="bevy.config_switch", text="", icon="EXPORT")
- config_switch.tool = "EXPORT"
- if config_mode == 'COMMON':
header, panel = layout.panel("common", default_closed=False)
header.label(text="Common")
if panel:
- draw_common_settings_ui(panel, blenvy)
- if config_mode == 'COMPONENTS':
- header, panel = layout.panel("components", default_closed=False)
- header.label(text="Components")
- if panel:
- components_ui.draw_settings_ui(panel, blenvy.components)
- if config_mode == 'EXPORT':
- header, panel = layout.panel("auto_export", default_closed=False)
- header.label(text="Export")
- if panel:
- auto_export_ui.draw_settings_ui(panel, blenvy.auto_export)
- def draw_common_settings_ui(layout, settings):
- blenvy = settings
- row = layout.row()
+ row = panel.row()
draw_folder_browser(layout=row, label="Root Folder", prop_origin=blenvy, target_property="project_root_path")
- row = layout.row()
+ row = panel.row()
draw_folder_browser(layout=row, label="Assets Folder", prop_origin=blenvy, target_property="assets_path")
- row = layout.row()
+ row = panel.row()
draw_folder_browser(layout=row, label="Blueprints Folder", prop_origin=blenvy, target_property="blueprints_path")
- row = layout.row()
+ row = panel.row()
draw_folder_browser(layout=row, label="Levels Folder", prop_origin=blenvy, target_property="levels_path")
- row = layout.row()
+ row = panel.row()
draw_folder_browser(layout=row, label="Materials Folder", prop_origin=blenvy, target_property="materials_path")
- layout.separator()
+ panel.separator()
# scenes selection
- if len(blenvy.level_scenes) == 0 and len(blenvy.library_scenes) == 0:
- row = layout.row()
+ if len(blenvy.main_scenes) == 0 and len(blenvy.library_scenes) == 0:
+ row = panel.row()
row.alert = True
- layout.alert = True
- row.label(text="NO library or level scenes specified! at least one level scene or library scene is required!")
- row = layout.row()
- row.label(text="Please select and add at least one:")
- section = layout
+ panel.alert = True
+ row.label(text="NO library or main scenes specified! at least one main scene or library scene is required!")
+ row = panel.row()
+ row.label(text="Please select and add one using the UI below")
+ section = panel
rows = 2
row = section.row()
- col = row.column()
- col.label(text="level scenes")
- col = row.column()
- col.prop(blenvy, "level_scene_selector", text='')
- col = row.column()
- add_operator = col.operator("blenvy.scenes_list_actions", icon='ADD', text="")
+ row.label(text="main scenes")
+ row.prop(blenvy, "main_scene_selector", text='')
+ add_operator = row.operator("blenvy.scenes_list_actions", icon='ADD', text="")
add_operator.action = 'ADD'
add_operator.scene_type = 'LEVEL'
- col.enabled = blenvy.level_scene_selector is not None
+ #sub_row.enabled = blenvy.main_scene_selector is not None
row = section.row()
col = row.column()
- for scene in blenvy.level_scenes:
+ for scene in blenvy.main_scenes:
sub_row = col.box().row()
sub_row.label(text=scene.name)
remove_operator = sub_row.operator("blenvy.scenes_list_actions", icon='TRASH', text="")
@@ -159,16 +127,11 @@ def draw_common_settings_ui(layout, settings):
# library scenes
row = section.row()
- col = row.column()
- col.label(text="library scenes")
- col = row.column()
- col.prop(blenvy, "library_scene_selector", text='')
- col = row.column()
- add_operator = col.operator("blenvy.scenes_list_actions", icon='ADD', text="")
+ row.label(text="library scenes")
+ row.prop(blenvy, "library_scene_selector", text='')
+ add_operator = row.operator("blenvy.scenes_list_actions", icon='ADD', text="")
add_operator.action = 'ADD'
add_operator.scene_type = 'LIBRARY'
- col.enabled = blenvy.library_scene_selector is not None
row = section.row()
col = row.column()
@@ -180,3 +143,15 @@ def draw_common_settings_ui(layout, settings):
remove_operator.scene_type = 'LEVEL'
remove_operator.scene_name = scene.name
col.separator()
+ header, panel = layout.panel("components", default_closed=False)
+ header.label(text="Components")
+ if panel:
+ components_ui.draw_settings_ui(panel, blenvy.components)
+ header, panel = layout.panel("auto_export", default_closed=False)
+ header.label(text="Auto Export")
+ if panel:
+ auto_export_ui.draw_settings_ui(panel, blenvy.auto_export)

[7 binary image files not shown (only a 'Before' version, i.e. removed); sizes: 42 KiB, 30 KiB, 54 KiB, 26 KiB, 15 KiB, 21 KiB, 66 KiB]
View File

@@ -27,7 +27,7 @@ class BLENVY_PT_levels_panel(bpy.types.Panel):
#blueprints_registry.refresh_blueprints()
blueprints_data = blueprints_registry.blueprints_data
- for scene in blenvy.level_scenes:
+ for scene in blenvy.main_scenes:
header, panel = layout.box().panel(f"level_assets{scene.name}", default_closed=False)
if header:
header.label(text=scene.name)#, icon="HIDE_OFF")

View File

@@ -98,23 +98,23 @@ This issue has been resolved in v0.9.
### Blueprints
- You can enable this option to automatically replace all the **collection instances** inside your level scene with blueprints
- - whenever you change your level scene (or your library scene , if that option is enabled), all your collection instances
+ You can enable this option to automatically replace all the **collection instances** inside your main scene with blueprints
+ - whenever you change your main scene (or your library scene , if that option is enabled), all your collection instances
* will be replaced with empties (this will not be visible to you)
* those empties will have additional custom properties / components : ```BlueprintInfo``` & ```SpawnBlueprint```
- * your level scene/ level will be exported to a much more trimmed down gltf file (see next point)
+ * your main scene/ level will be exported to a much more trimmed down gltf file (see next point)
* all the original collections (that you used to create the instances) will be exported as **seperate gltf files** into the "library" folder
- this means you will have
* one small main gltf file (your level/world)
- * as many gltf files as you have used collections in the level scene , in the library path you specified :
+ * as many gltf files as you have used collections in the main scene , in the library path you specified :
for the included [basic](../../examples/bevy_gltf_blueprints/basic/) example's [assets](../../examples/bevy_gltf_blueprints/basic/assets/), it looks something like this:
![library](./docs/exported_library_files.png)
the .blend file that they are generated from can be found [here](../../examples/bevy_gltf_blueprints/basic/assets/advanced.blend)
- - the above only applies to collections that have **instances** in your level scene!
+ - the above only applies to collections that have **instances** in your main scene!
if you want a specific collection in your library to always get exported regardless of its use, you need to add
a **COLLECTION** (boolean) custom property called ```AutoExport``` set to true
> not at the object level ! the collection level !
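
The docs above describe the instance-to-empty swap only in prose. As a rough illustration (not the add-on's actual code), here is a minimal sketch of what that replacement could look like with Blender's Python API; the `blueprints_path` argument and the exact encoding of the custom property values are assumptions:

```python
# Illustrative sketch only: NOT the add-on's real implementation.
# Replaces a collection instance with an empty that carries the BlueprintInfo &
# SpawnBlueprint custom properties mentioned above. The RON-like value format is assumed.
import bpy

def replace_instance_with_empty(instance_obj, blueprints_path="blueprints"):
    collection = instance_obj.instance_collection
    if collection is None:
        return None  # not a collection instance, nothing to do

    empty = bpy.data.objects.new(instance_obj.name, None)  # object data = None -> an empty
    empty.matrix_world = instance_obj.matrix_world.copy()  # keep the instance's transform

    # custom properties end up in the gltf "extras" and are read back as components
    empty["BlueprintInfo"] = f'(name: "{collection.name}", path: "{blueprints_path}/{collection.name}.glb")'
    empty["SpawnBlueprint"] = "()"

    # link the empty everywhere the instance was linked, then drop the instance
    for parent_collection in instance_obj.users_collection:
        parent_collection.objects.link(empty)
    bpy.data.objects.remove(instance_obj)
    return empty
```

In practice the exporter presumably performs this swap on a temporary copy of the scene and restores things afterwards, which is why the docs say the replacement "will not be visible to you".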

View File

@@ -63,9 +63,9 @@ def test_export_change_tracking_custom_properties(setup_data):
prepare_auto_export()
def first_change():
- # now add a custom property to the cube in the level scene & export again
+ # now add a custom property to the cube in the main scene & export again
print("----------------")
- print("level scene change (custom property)")
+ print("main scene change (custom property)")
print("----------------")
bpy.data.objects["Cube"]["test_property"] = 42
@@ -86,7 +86,7 @@ def test_export_change_tracking_custom_properties_collection_instances_combine_m
def second_change():
# add a custom property to the cube in the library scene & export again
- # this should trigger changes in the level scene as well since the mode is embed & this blueprints has an instance in the level scene
+ # this should trigger changes in the main scene as well since the mode is embed & this blueprints has an instance in the main scene
print("----------------")
print("library change (custom property)")
print("----------------")
@@ -94,7 +94,7 @@ def test_export_change_tracking_custom_properties_collection_instances_combine_m
def third_change():
# now we set the _combine mode of the instance to "split", so auto_export should:
- # * not take the changes into account in the level scene
+ # * not take the changes into account in the main scene
# * export the blueprint (so file for Blueprint1 will be changed)
bpy.data.objects["Blueprint1"]["_combine"] = "Split"
@@ -117,9 +117,9 @@ def test_export_change_tracking_light_properties(setup_data):
prepare_auto_export()
def first_change():
- # now add a custom property to the cube in the level scene & export again
+ # now add a custom property to the cube in the main scene & export again
print("----------------")
- print("level scene change (light, energy)")
+ print("main scene change (light, energy)")
print("----------------")
bpy.data.lights["Light"].energy = 100
@@ -128,14 +128,14 @@ def test_export_change_tracking_light_properties(setup_data):
def second_change():
print("----------------")
- print("level scene change (light, shadow_cascade_count)")
+ print("main scene change (light, shadow_cascade_count)")
print("----------------")
bpy.data.lights["Light"].shadow_cascade_count = 2
def third_change():
print("----------------")
- print("level scene change (light, use_shadow)")
+ print("main scene change (light, use_shadow)")
print("----------------")
bpy.data.lights["Light"].use_shadow = False
@@ -153,7 +153,7 @@ def test_export_change_tracking_camera_properties(setup_data):
def first_change():
print("----------------")
- print("level scene change (camera)")
+ print("main scene change (camera)")
print("----------------")
bpy.data.cameras["Camera"].angle = 0.5
@@ -170,20 +170,20 @@ def test_export_change_tracking_material_properties(setup_data):
def first_change():
print("----------------")
- print("level scene change (material, clip)")
+ print("main scene change (material, clip)")
print("----------------")
bpy.data.materials["Material.001"].blend_method = 'CLIP'
def second_change():
print("----------------")
- print("level scene change (material, alpha_threshold)")
+ print("main scene change (material, alpha_threshold)")
print("----------------")
bpy.data.materials["Material.001"].alpha_threshold = 0.2
def third_change():
print("----------------")
- print("level scene change (material, diffuse_color)")
+ print("main scene change (material, diffuse_color)")
print("----------------")
bpy.data.materials["Material.001"].diffuse_color[0] = 0.2
@@ -200,7 +200,7 @@ def test_export_change_tracking_material_properties(setup_data):
- setup gltf parameters & auto_export parameters
- calls exporter on the testing scene
- saves timestamps of generated files
- - changes things in the level scene and/or library
+ - changes things in the main scene and/or library
- checks if timestamps have changed
- if all worked => test is a-ok
- removes generated files
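
The docstring above is the gist of every change-tracking test in this file. A minimal sketch of that timestamp comparison follows; the file paths and helper names are made up for illustration and are not the test suite's actual ones:

```python
# Minimal sketch of the timestamp check described in the docstring above.
import os

def snapshot_mtimes(paths):
    # record modification times for files that exist, None for missing ones
    return {p: os.path.getmtime(p) if os.path.exists(p) else None for p in paths}

def changed_files(before, after):
    # any path whose mtime differs between the two snapshots was re-exported
    return [path for path in after if after[path] != before[path]]

# usage sketch:
# watched = ["assets/levels/World.glb", "assets/blueprints/Blueprint1.glb"]
# before = snapshot_mtimes(watched)
# ...make a change in Blender, trigger the auto export...
# after = snapshot_mtimes(watched)
# assert "assets/blueprints/Blueprint1.glb" in changed_files(before, after)
```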
@@ -219,7 +219,7 @@ def test_export_various_chained_changes(setup_data):
def second_change():
# now move the main cube & export again
print("----------------")
- print("level scene change")
+ print("main scene change")
print("----------------")
bpy.context.window_manager.auto_export_tracker.enable_change_detection() # FIXME: should not be needed, but ..

View File

@@ -92,7 +92,7 @@ def test_export_no_parameters(setup_data):
def test_export_auto_export_parameters_only(setup_data):
auto_export_operator = bpy.ops.export_scenes.auto_gltf
export_props = {
- "level_scene_names" : ['World'],
+ "main_scene_names" : ['World'],
"library_scene_names": ['Library'],
}
@@ -119,7 +119,7 @@ def test_export_changed_parameters(setup_data):
# first, configure things
# we use the global settings for that
export_props = {
- "level_scene_names" : ['World'],
+ "main_scene_names" : ['World'],
"library_scene_names": ['Library'],
}
@@ -216,7 +216,7 @@ def test_export_changed_parameters(setup_data):
print("fourth export, changed auto parameters")
export_props = {
- "level_scene_names" : ['World'],
+ "main_scene_names" : ['World'],
"library_scene_names": ['Library'],
"export_materials_library": False # we need to add it here, as the direct settings set on the operator will only be used for the NEXT run
}

View File

@@ -70,7 +70,7 @@ def test_export_do_not_export_blueprints(setup_data):
# first, configure things
# we use the global settings for that
export_props = {
- "level_scene_names" : ['World'],
+ "main_scene_names" : ['World'],
"library_scene_names": ['Library']
}
stored_auto_settings = bpy.data.texts[".gltf_auto_export_settings"] if ".gltf_auto_export_settings" in bpy.data.texts else bpy.data.texts.new(".gltf_auto_export_settings")
@@ -96,7 +96,7 @@ def test_export_custom_blueprints_path(setup_data):
# first, configure things
# we use the global settings for that
export_props = {
- "level_scene_names" : ['World'],
+ "main_scene_names" : ['World'],
"library_scene_names": ['Library']
}
@@ -123,7 +123,7 @@ def test_export_materials_library(setup_data):
# first, configure things
# we use the global settings for that
export_props = {
- "level_scene_names" : ['World'],
+ "main_scene_names" : ['World'],
"library_scene_names": ['Library']
}
stored_settings = bpy.data.texts[".gltf_auto_export_settings"] if ".gltf_auto_export_settings" in bpy.data.texts else bpy.data.texts.new(".gltf_auto_export_settings")
@@ -150,7 +150,7 @@ def test_export_materials_library_custom_path(setup_data):
# first, configure things
# we use the global settings for that
export_props = {
- "level_scene_names" : ['World'],
+ "main_scene_names" : ['World'],
"library_scene_names": ['Library']
}
stored_settings = bpy.data.texts[".gltf_auto_export_settings"] if ".gltf_auto_export_settings" in bpy.data.texts else bpy.data.texts.new(".gltf_auto_export_settings")
@@ -179,7 +179,7 @@ def test_export_collection_instances_combine_mode(setup_data): # There is more i
# first, configure things
# we use the global settings for that
export_props = {
- "level_scene_names" : ['World'],
+ "main_scene_names" : ['World'],
"library_scene_names": ['Library']
}
stored_settings = bpy.data.texts[".gltf_auto_export_settings"] if ".gltf_auto_export_settings" in bpy.data.texts else bpy.data.texts.new(".gltf_auto_export_settings")
@@ -209,7 +209,7 @@ def test_export_do_not_export_marked_assets(setup_data):
# first, configure things
# we use the global settings for that
export_props = {
- "level_scene_names" : ['World'],
+ "main_scene_names" : ['World'],
"library_scene_names": ['Library']
}
stored_settings = bpy.data.texts[".gltf_auto_export_settings"] if ".gltf_auto_export_settings" in bpy.data.texts else bpy.data.texts.new(".gltf_auto_export_settings")
@@ -239,7 +239,7 @@ def test_export_separate_dynamic_and_static_objects(setup_data):
# first, configure things
# we use the global settings for that
export_props = {
- "level_scene_names" : ['World'],
+ "main_scene_names" : ['World'],
"library_scene_names": ['Library']
}
stored_settings = bpy.data.texts[".gltf_auto_export_settings"] if ".gltf_auto_export_settings" in bpy.data.texts else bpy.data.texts.new(".gltf_auto_export_settings")
@@ -270,7 +270,7 @@ def test_export_should_not_generate_orphan_data(setup_data):
# first, configure things
# we use the global settings for that
export_props = {
- "level_scene_names" : ['World'],
+ "main_scene_names" : ['World'],
"library_scene_names": ['Library']
}
stored_settings = bpy.data.texts[".gltf_auto_export_settings"] if ".gltf_auto_export_settings" in bpy.data.texts else bpy.data.texts.new(".gltf_auto_export_settings")

View File

@@ -8,7 +8,7 @@ def prepare_auto_export(auto_export_overrides={}, gltf_export_settings = {"expor
# first, configure things
# we use the global settings for that
export_props = {
- "level_scene_names" : ['World'],
+ "main_scene_names" : ['World'],
"library_scene_names": ['Library'],
**auto_export_overrides
}