🧪 Knowledge sharing "Running Modular Avatar on other apps"
A new knowledge sharing article has been published, "Running Modular Avatar on other apps".
The Starmesh Op. Transfer Blendshape component attempts to create a blendshape on a costume that mimics the movement of a blendshape from another mesh.
Try using this to transfer nail deformations, chest deformations, or hip deformations (i.e. Hip_big) from a base mesh to a costume.
Combine it with other Selectors to limit the areas affected by the transferred deformations.
This component may not always produce good results.
In addition, if you use other tools that attempt to fit a costume on an avatar it was not designed for (i.e. fitting a costume made for Manuka on a Lime base body), this will probably not work, as this operator expects the mesh data to overlap, regardless of how the bones are arranged in the scene.
The Animator As Code V1 API has been updated to improve support for third-party asset container management.
This adds AacConfiguration.AssetContainerProvider to specify an asset container provider, and IAacAssetContainerProvider to abstract asset container management.
This change has been contributed by kb10uy (KOBAYASHI Yū) (first contribution).
The NDMF example in the Getting started page has been updated to demonstrate integration with this new API.
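To illustrate, here is a minimal sketch of what an NDMF-backed provider could look like. This is not the documentation's exact example: the interface member names and the way assets are persisted below are assumptions, so refer to the Getting started page for the authoritative version.

using nadena.dev.ndmf;
using UnityEditor;
using AnimatorAsCode.V1;
using Object = UnityEngine.Object;

internal class NdmfContainerProvider : IAacAssetContainerProvider
{
    private readonly BuildContext _ctx;
    public NdmfContainerProvider(BuildContext ctx) => _ctx = ctx;

    // Assumed member name: persist assets that must exist in the AssetDatabase (e.g. the animator controller).
    public void SaveAsPersistenceRequired(Object objectToAdd) => AssetDatabase.AddObjectToAsset(objectToAdd, _ctx.AssetContainer);

    // Assumed member name: other assets can be left for NDMF to collect at the end of the build.
    public void SaveAsRegular(Object objectToAdd) { }

    // Assumed member name: nothing to clear in a non-destructive build.
    public void ClearPreviousAssets() { }
}

// The provider would then be referenced from the configuration, for example:
// new AacConfiguration { /* ... */ AssetContainerProvider = new NdmfContainerProvider(ctx) }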
As specified in the changelog for the official release of Animator As Code V1, the breaking changes that were planned for V1.2.0 have been applied.
These breaking changes are meant to be the last breaking changes for the lifetime of Animator As Code V1.
For more details, see the full changelog.
The sub-packages of Animator As Code "VRChat", "VRChat Destructive Workflow", and "Modular Avatar functions" have been updated to allow installation with Animator As Code V1.2.0 and all future versions of Animator As Code V1.
There are no other changes.
A new version of Starmesh has been released.
Starmesh now has code that is shared with FaceTra Shape Creator under the MeshLib package; this is the main reason this update has been published.
This update contains fixes to the order in which Selectors and Operators are processed, so that Op. Paint New Bone does not interfere with operators that use Select Bones.
A new version of FaceTra Shape Creator has been released.
There are a lot of changes, described in full in the changelog.
In summary:
FaceTra now shares code with Starmesh under the MeshLib package, which is included in both products.
If you use Starmesh, please update Starmesh to V1.6.0.
Hello!
I've gone through a lot of projects lately, and here's a summary of what has been happening.
My development time is divided between the maintenance of free tools, Patreon-exclusive tools, and the continuation of research.
(This page is a clone of the post published on Patreon)
☀️ Animator As Code V1 has been released. This update brings Sub-State Machines, Blend Trees, support for non-VRChat projects, better integration with non-destructive workflows, integration with Modular Avatar, support for VRCAnimatorPlayAudio, and prettier support for VRCParameterDriver. The V1 API is now considered stable, and there will be no more breaking changes. (Read more)
☀️ Last month, I released LetMeSee, a new tool that lets you see your content in VR, with the Unity Editor in Edit Mode. I had initially created this tool so that I could test VR toon shaders in a non-VRChat Unity project that uses the Universal Render Pipeline, but it works just as well for more traditional projects. (Read more)
⚙️ In Prefabulous Universal, Generate Twist Bones now creates VRC Constraints if the VRChat SDK is installed. In addition, Delete Polygons, Assign UV Tile, and Replace Textures should now run before VRCFury.
☀️ It is almost guaranteed that in the next few months, some complex clothing (i.e. kimonos and yukatas) will be sold with VRC Constraints directly in them. For this reason, I've added a new component, Convert VRC Constraints to Unity Constraints, to provide options for users who wish to use avatars in other Unity applications, VTubing apps, or social VR platforms.
✨ Animation Viewer now supports CTRL-K, the Unity Search window. (Read more)
✨ Double Hip Tracker is receiving an update in V1.3.0 that changes the behaviour of the double trackers. This behaviour is optional and enabled by default.
In short, we will measure the distance that separates the two trackers. If that distance changes too much, one of the trackers probably flew off. We will now try to detect that, in addition to the usual method. (Read more)
Also, when both trackers are lost, the double tracker will now freeze in place, instead of flying off.
There are other changes described in the full update post.
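To make the idea concrete, here is a rough illustrative sketch of such a distance check; this is not the actual Double Hip Tracker implementation, and the tolerance value is arbitrary.

using UnityEngine;

public class TrackerDistanceCheck
{
    private float _referenceDistance;

    // Record the distance between the two trackers while tracking is known to be good.
    public void Calibrate(Vector3 trackerA, Vector3 trackerB)
    {
        _referenceDistance = Vector3.Distance(trackerA, trackerB);
    }

    // If the current distance deviates too much from the reference, one of the trackers probably flew off.
    public bool OneTrackerProbablyFlewOff(Vector3 trackerA, Vector3 trackerB, float toleranceInMeters = 0.05f)
    {
        var currentDistance = Vector3.Distance(trackerA, trackerB);
        return Mathf.Abs(currentDistance - _referenceDistance) > toleranceInMeters;
    }
}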
✨ Vixen is receiving a small update in V1.3.0 so that you can change the boundary values of PhysBones, Contacts, and OSC. This is especially useful to change the activation range of a PhysBone angle, the range of a PhysBone squish, or to use two different ranges with a single proximity Contact. That, along with a few fixes. (Read more)
✨ In IconGen V1.1.0, you can now export all of your decorated icons to PNG, so that you can redistribute decorated icons to other users. (Learn how)
If you're a Patreon supporter, you should download them now!
More updates are currently being worked on for FaceTra Shape Creator, Starmesh, and Vixen.
🧪 Project H-View: I have shared on GitHub a personal project: an ImGui.NET application that displays the entire Expressions Menu in a compact layout and makes extensive use of OSCQuery. (Read more) (GitHub)
It also has an early implementation of that ImGui application being rendered directly into a SteamVR Overlay! (Video)
I've always wanted to try controlling tools like VRCLens and VirtualLens2 through an OSC application, and also to learn how to build an ImGui overlay using the OpenVR API, so I'm happy to have finally taken the time to do this.
If you are a C# developer, you may be able to make use of that project (it's under the MIT License).
🧪 Project Nochat: UdonSharp has been mainstream for so long that we may have forgotten that C# files (.cs) were never intended to be executed directly inside VRChat worlds. The ability to write C# and have it work in VRChat worlds is entirely community-driven.
Therefore, VRChat prefabs that use UdonSharp do not depend on any of VRChat's intellectual property at all.
I've taken the opportunity to try running UdonSharp prefabs in a plain Unity project with VR controls, without VRChat. It works! Not a single line of executable code from VRChat is needed, as Udon is completely unnecessary. (Video)
The fact that UdonSharp prefabs do not require VRChat to run matters to me because I want to create and enable experiences outside the limitations of the VRChat platform; porting content could make it possible to experience content we're already familiar with, but with different virtual environment capabilities. I hope I'll be able to share more with you on this subject. (Read more)
🧪 Project Myrddin: This project is very similar to Project Nochat; however, it is an attempt to run the VRChat SDK without Udon, with the option to run ClientSim with VR controls. UdonSharp components would run as plain C#, without the Udon VM. This way, one may be able to use traditional IDE debugging features directly on UdonSharp content (breakpoints, instruction stepping, hot code reload without leaving Play mode, etc.).
VR controls in-editor would let you grab world pickups with actual VR controllers and interact with them, opening the possibility of iterating faster on VR content just like a normal Unity game developer would. (Video)
Your support makes it sustainable to continue the development of all these projects as a full-time occupation. Thank you so much! ⭐
You can now customize the bounds in OSC Floats, Contacts, and PhysBones.
This update introduces two changes in the behaviour of the double trackers, as described above.
Changes:
A new component, Convert VRC Constraints to Unity Constraints, has been added to Prefabulous for Platform Conversions.
It converts VRC Constraint components back to native Unity Constraints, to the extent applicable.
I am releasing Animator As Code V1.
Starting from V1.1.0, all Animator As Code V1 packages are leaving Alpha/Beta.
Animator As Code V1 can now safely be used in public projects.
Animator As Code V1 is designed to be installed even in projects that already have Animator As Code V0.
Animator As Code V0 will not be overwritten by Animator As Code V1. Both installations will act as separate, non-conflicting installs. Projects like FaceEmo will continue to function properly.
In fact, Animator As Code V1 has already been used extensively in my own tools (Prefabulous, ComboGestureExpressions, Vixen).
There is no real need to migrate from V0 to V1 if V0 already provides all the functionality you need in your project.
If you choose to migrate, V0 and V1 are almost identical.
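As a rough illustration of how close the two APIs are, here is a hedged sketch, assuming the usual entry points (AacV0.Create in V0, AacV1.Create in V1), an already prepared AacConfiguration named `configuration`, and a `myObject` GameObject:

// V0 (namespace AnimatorAsCode.V0):
//   var aac = AacV0.Create(configuration);
// V1 (namespace AnimatorAsCode.V1):
var aac = AacV1.Create(configuration);

// From there, most calls read the same in both versions, for example:
var clip = aac.NewClip().Toggling(myObject, true);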
// When using Sub-State Machines, the Sub-State Machine will evaluate all transitions until
// it resolves a destination state within one single frame.
// This means it can traverse multiple transition conditions at once, no matter how nested
// the Sub-State Machine is.
//
// This is not doable with just states, so Sub-State Machines have a functional value beyond mere organization.
var a = layer.IntParameter("IntA");
var b = layer.IntParameter("IntB");
var rootSsm = layer.NewSubStateMachine("UsingNestedSubStateMachines");
for (var i = 0; i < 16; i++)
{
// A Sub-State Machine can have other Sub-State Machines created inside them.
// TransitionsFromEntry creates a transition between `subSsm` and the Entry node of the Sub-State Machine it belongs in.
// Exits creates a transition between `subSsm` and the Exit node.
var subSsm = rootSsm.NewSubStateMachine($"A = {i}");
subSsm.TransitionsFromEntry().When(a.IsEqualTo(i));
subSsm.Exits();
for (var j = 0; j < 16; j++)
{
var state = subSsm.NewState($"A = {i} and B = {j}");
state.TransitionsFromEntry().When(b.IsEqualTo(j));
state.Exits()
.When(a.IsNotEqualTo(i))
.Or()
.When(b.IsNotEqualTo(j));
}
}
// This creates a transition between the Sub-State Machine and itself.
// When that Sub-State Machine exits, it will re-enter itself.
rootSsm.Restarts();
// Animator As Code V1 no longer requires VRChat (compared to V0).
// VRChat-specific functions have been moved to extension methods.
// If you want to use VRChat Avatars functionality, add the `Animator As Code V1 - VRChat` package, and do the following:
//
// Add the following import, which contains extension methods:
using AnimatorAsCode.V1.VRC;
// To access VRChat parameters, use the following extension method:
var vrcAv3 = layer.Av3();
// To access VRChat assets, use the following extension method:
var vrcAssets = aac.VrcAssets();
layer.NewState("UsingVRChat")
.WithAnimation(vrcAssets.ProxyForGesture(AacAv3.Av3Gesture.HandOpen, false))
// VRChat State Behaviours are created through extension methods located in namespace `AnimatorAsCode.V1.VRC`
.TrackingAnimates(AacAv3.Av3TrackingElement.RightHand)
.Driving(driver => driver.Sets(layer.BoolParameter("A"), true))
.TransitionsFromEntry()
.When(vrcAv3.GestureRight.IsEqualTo(AacAv3.Av3Gesture.HandOpen));
layer.NewState("BlendTrees").WithAnimation(aac.NewBlendTree()
.FreeformDirectional2D(layer.FloatParameter("X"), layer.FloatParameter("Y"))
.WithAnimation(aac.NewClip("Center"), 0, 0)
.WithAnimation(aac.NewClip("Up"), Vector2.up)
.WithAnimation(aac.NewClip("Right"), 1, 0)
.WithAnimation(aac.NewClip("Down"), 0, -1)
.WithAnimation(aac.NewClip("Left"), -1, 0)
);
var one = layer.FloatParameter("One");
layer.OverrideValue(one, 1f);
layer.NewState("Direct BlendTree").WithAnimation(aac.NewBlendTree()
.Direct()
.WithAnimation(aac.NewClip("DrivenByA"), layer.FloatParameter("A"))
.WithAnimation(aac.NewClip("AlwaysOn"), one)
.WithAnimation(aac.NewBlendTree().Simple1D(layer.FloatParameter("B"))
// In Animator As Code, it is safe to declare points in a Simple1D blend tree in a different order (i.e. 0, 1, -1).
// (In native blend trees, it would not have been safe to do so)
.WithAnimation(aac.NewClip("Zero"), 0)
.WithAnimation(aac.NewClip("Positive"), 1)
.WithAnimation(aac.NewClip("Negative"), -1)
, one)
);
// When creating a new controller, the ContainerMode of the configuration usually needs to be
// set to either `Container.OnlyWhenPersistenceRequired` or `Container.Everything`.
//
// This is because it is not possible to add state behaviours to states unless the Animator Controller
// is already persisted in the asset database.
var controller = aac.NewAnimatorController();
var fx = controller.NewLayer();
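For reference, here is a hedged sketch of what building such a configuration could look like; apart from ContainerMode, the field names below are recalled from the AAC examples and may differ in your version, and `avatarRoot` and `assetContainer` are placeholders:

var aac = AacV1.Create(new AacConfiguration
{
    SystemName = "Example",
    AnimatorRoot = avatarRoot.transform,      // placeholder: the root transform of the avatar
    DefaultValueRoot = avatarRoot.transform,  // placeholder: where default values are sampled from
    AssetKey = GUID.Generate().ToString(),
    AssetContainer = assetContainer,          // placeholder: an asset already persisted in the AssetDatabase
    ContainerMode = AacConfiguration.Container.OnlyWhenPersistenceRequired,
    DefaultsProvider = new AacDefaultsProvider(writeDefaults: true)
});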
VRCAnimatorPlayAudio is supported through the AacFlState.Audio(AudioSource or string, ...) function, which takes a lambda expression as a parameter.
AudioSource source = MyAudioSource(); // This can be a string instead.
AudioClip[] clips = MyAudioClips();
layer.NewState("UsingAudio")
.Audio(source, audio =>
{
// Get the PlayAudio object if there's a need to edit it directly.
VRCAnimatorPlayAudio vrcAnimatorPlayAudio = audio.PlayAudio;
// By default, a PlayAudio created through AAC does nothing (unlike a manually created behaviour)
// so you need to invoke anything that is relevant.
audio
.SelectsClip(VRC_AnimatorPlayAudio.Order.Random, clips)
.SetsLooping()
.RandomizesPitch(0.8f, 1.2f)
.RandomizesVolume(0.5f, 1f)
// "Replays" means Stop and Play.
// "StartsPlaying" means just Play.
// "StopsPlaying" means just Stop.
// To do neither Stop nor Play, don't invoke anything.
.StartsPlayingOnEnter()
.StopsPlayingOnExit();
});
VRCAvatarParameterDriver is supported through the AacFlState.Driving(...) function, which takes a lambda expression as a parameter.
If you invoke AacFlState.Driving(...) multiple times, it will create multiple VRCAvatarParameterDriver behaviours on the same state. One may be local, the other not.
layer.NewState("UsingDrivers").Driving(driver => driver
.Copies(layer.FloatParameter("CopySource"), layer.FloatParameter("CopyDestination"))
.Sets(layer.FloatParameter("Set"), 2.3f)
.Increases(layer.FloatParameter("IncreaseBy2"), 2f)
.Decreases(layer.FloatParameter("DecreaseBy3"), 3f)
.Randomizes(layer.FloatParameter("Randomizes"), 0f, 100f)
.Randomizes(layer.BoolParameter("RandomizesBool"), 0.25f) // 25% chance of being true.
.Remaps(layer.FloatParameter("RemapsSource"), 0f, 2f, layer.FloatParameter("RemapsDestination"), 2f, 4f)
// This creates a second VRCAvatarParameterDriver in the same state.
).Driving(driver => driver
.Sets(layer.FloatParameter("SecondDriver"), 100f)
.Locally() // Only this second VRCAvatarParameterDriver is local.
);
Using the Animator As Code V1 - Modular Avatar functions package, you can create Modular Avatar components.
var ctrl = aac.NewAnimatorController();
var fx = ctrl.NewLayer();
var toggleFloatParameter = fx.FloatParameter("MyToggle");
fx.NewState("BlendTree")
.WithAnimation(aac.NewBlendTree().Simple1D(toggleFloatParameter)
.WithAnimation(aac.NewClip().Toggling(myObject, false), 0)
.WithAnimation(aac.NewClip().Toggling(myObject, true), 1)
)
.WithWriteDefaultsSetTo(true);
// Create a new GameObject in the scene (the `holder` object below). We will add Modular Avatar components inside it.
var modularAvatar = MaAc.Create(holder);
// By creating a Modular Avatar Merge Animator component,
// our animator controller will be added to the avatar's FX layer.
modularAvatar.NewMergeAnimator(ctrl, VRCAvatarDescriptor.AnimLayerType.FX);
// We use a float in the animator blend tree, but we declare it as a bool
// so that it takes 1 bit in the expression parameters.
// By default, it is saved and synced.
modularAvatar.NewBoolToFloatParameter(toggleFloatParameter).WithDefaultValue(true);
The documentation includes an example showing how to integrate with the Non-Destructive Modular Framework (NDMF) using a plugin.
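As a rough idea of what such a plugin can look like, here is a sketch using the NDMF plugin API (nadena.dev.ndmf); the AAC setup itself is elided and would follow the configuration example shown earlier:

using nadena.dev.ndmf;

[assembly: ExportsPlugin(typeof(ExampleAacPlugin))]

public class ExampleAacPlugin : Plugin<ExampleAacPlugin>
{
    protected override void Configure()
    {
        InPhase(BuildPhase.Generating).Run("Generate animator with AAC", ctx =>
        {
            // Build an AacConfiguration here (see the earlier sketch), typically using ctx.AvatarRootTransform
            // as the animator root, then create controllers and layers, and wire them up with Modular Avatar components.
        });
    }
}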
Animator As Code V1.1.0 contains breaking changes compared to Animator As Code (Alpha) V1.0.99xx.
This list does not contain the breaking changes between V0 and V1; please see the migration guide for those.
Compared to 1.0.99xx:
Other notes:
These are likely going to be the last breaking changes in V1's lifetime.
One note concerns the VRC_SDK_VRCSDK3 scripting define being kept around.
Another note concerns the IEditorOnly class, which is only relevant if the VRChat Avatars SDK is installed.