
🔺 HVR Basis Face Tracking (Update)


HVR Basis Face Tracking has been migrated to the new Basis networking system.

Instructions for previous Face Tracking avatar users:

  • If you have previously used Face Tracking in Basis, do the following:

  • Remove all components that were part of the previous Face Tracking instructions. That includes:

    • Remove the Feature Networking component
    • Remove the HVR Avatar Comms component
    • Remove the Eye Tracking Bone Actuation component
    • Remove the Blendshape Actuation component
    • Remove the OSC Acquisition component
    • Remove the Acquisition Service component
  • To install the new Face Tracking, do the following:

    • Create a new GameObject inside your avatar. Give it a name of your choice, like FaceTracking.
    • Add the Automatic Face Tracking component.
    • In the inspector of the Automatic Face Tracking component, press the "Create VRCFaceTracking JSON file" button.
    • You will need to do this even if you have used previous versions of Face Tracking in Basis.

The new face tracking has been modified to work with the Steam version of VRCFaceTracking (it now provides an OSCQuery service), but it should also work with older versions of VRCFaceTracking that were not installed from Steam.

Backend changes

Migrate face tracking to newest Basis networking

  • Migrate to newer Basis avatar networking systems:
    • Remove GUID-based networking identifiers, which were a remnant of the previous networking system where BasisAvatarMonoBehaviour did not exist and all packets had to be routed manually to their destination components.
    • Migrate networking to the new BasisAvatarMonoBehaviour networking system, which handles packet routing on its own by transmitting packets based on the component rather than based on the avatar.
  • Add Automatic Face Tracking component:
    • The Automatic Face Tracking component is now the entry point to adding HVR face tracking capabilities to the avatar.
    • This component automatically detects all SkinnedMeshRenderer components on the avatar that have ARKit or Unified Expressions blendshapes on them.
    • By default, this component uses the basic face tracking definition files that are bundled with the application, but the user can either override or supplement these.
      • Due to this, the face tracking assets are now addressables.
    • This component automatically starts an OSC service on the wearer.
  • Improve the handling of face tracking data:
    • Add Address Overrides to the actuation definition file, so that we can define default values for the eyelids.
    • Work towards the mutualization of the interpolation of addresses between Eye Tracking and Face Tracking.
    • Actuators now introspect which blendshapes actually exist on the target renderers, and will no longer try to network any actuator that has no blendshape associated with it.
    • Add more debugging in the blendshape actuation definition file editor.
    • Fix the TongueRoll actuator targeting the wrong blendshape.
  • Move the package to dev.hai-vr.basis.comms/
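
The actuator introspection mentioned above can be sketched as follows. This is a minimal illustration of the idea, not the real HVR Basis API: actuators are only kept (and networked) when some target renderer actually exposes the blendshape they drive; all names and data shapes here are hypothetical.

```python
# Hypothetical sketch: drop any actuator whose blendshape does not exist
# on any of the avatar's target renderers, so it is never networked.

def filter_actuators(actuators, renderer_blendshapes):
    """Keep only actuators whose blendshape exists on some target renderer.

    actuators: list of (actuator_name, blendshape_name) pairs.
    renderer_blendshapes: dict mapping renderer name -> set of blendshape names.
    """
    available = set()
    for shapes in renderer_blendshapes.values():
        available |= shapes
    return [a for a in actuators if a[1] in available]

# Illustrative example: TongueRoll exists on the face mesh, JawOpen does not
# exist on any renderer, so the JawOpen actuator is discarded.
renderers = {"Face": {"TongueRoll", "EyeBlinkLeft"}, "Body": set()}
actuators = [("TongueRoll", "TongueRoll"), ("JawOpen", "JawOpen")]
kept = filter_actuators(actuators, renderers)
```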

Change VRCFaceTracking setup process

  • In the Automatic Face Tracking component, the user now presses a button to create the face tracking JSON file used by VRCFaceTracking.
  • We want to ensure we overwrite any previous version of face tracking that users of the previous system may have already installed, so we're changing the dummy avatar ID to a different one. That way, we're sure this is really the file that VRCFaceTracking will load.
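
A rough sketch of what the "Create VRCFaceTracking JSON file" button does, assuming VRCFaceTracking reads per-avatar JSON files from VRChat's OSC config directory. The directory layout, the dummy avatar ID, and the JSON field names below are illustrative assumptions, not the real values used by HVR Basis.

```python
# Hypothetical sketch: write the per-avatar JSON under a *new* dummy avatar
# ID, so it takes precedence over any file installed by the previous system.
import json
import os

DUMMY_AVATAR_ID = "avtr_00000000-basis-face-tracking-dummy"  # illustrative ID

def write_face_tracking_json(osc_config_dir, parameters):
    avatar_dir = os.path.join(osc_config_dir, "Avatars")
    os.makedirs(avatar_dir, exist_ok=True)
    path = os.path.join(avatar_dir, DUMMY_AVATAR_ID + ".json")
    document = {
        "id": DUMMY_AVATAR_ID,
        "parameters": [
            {"name": p, "input": {"address": f"/avatar/parameters/{p}", "type": "Float"}}
            for p in parameters
        ],
    }
    with open(path, "w") as f:
        json.dump(document, f, indent=2)
    return path
```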

Fix always update local eye angles

  • Update local eye angles, even if no address has changed.
  • This should fix an issue where the eyes were always looking forward.
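
The shape of this fix can be sketched as follows (illustrative names, not the actual component code): the eye-angle update used to run only when an OSC address changed, so with no incoming changes the eyes snapped forward; updating every frame from the last known values keeps the gaze applied.

```python
# Hypothetical sketch: keep applying the last known eye angles every frame
# instead of returning early when no address has changed.

class EyeActuation:
    def __init__(self):
        self.pitch = 0.0
        self.yaw = 0.0

    def on_address_changed(self, pitch, yaw):
        # Called only when the OSC address delivers new values.
        self.pitch, self.yaw = pitch, yaw

    def update(self):
        # Previously this returned early when nothing changed; now the local
        # eye angles are always recomputed and applied.
        return (self.pitch, self.yaw)
```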

Add a fake OSCQuery service to cope with the Steam version of VRCFaceTracking

  • The Steam version of VRCFaceTracking 5.3.2.0 does not accept /avatar/change events; it relies only on OSCQuery for initialization.
  • We now impersonate the OSCQuery service by doing the following:
    • A) Open an HTTP service on a random port:
      • When queried for any URL that ends with /avatar, it replies with a pre-determined message (check out response-avtr.json).
      • When queried for any other URL, it replies with another pre-determined message (check out response.json).
    • B) Advertise that port as a service on mDNS as _oscjson._tcp with the instance name VRChat-Client-XXXXXX where XXXXXX is a random number between 100000 and 999999.
    • C) Query for _oscjson._tcp once when the service starts.
    • A and B are sufficient for VRCFaceTracking to detect our program if the OsushiQuery service is already running when VRCFaceTracking starts.
    • C is needed to handle the case where VRCFaceTracking is already running before our program starts the OsushiQuery service.
    • The HTTP service completely violates the OSCQuery protocol and is not intended to be read by any program other than VRCFaceTracking.
  • The JSON file in the user's directory is still needed for this to function properly.
  • This also means that the application requests a pre-determined list of parameters from VRCFaceTracking, as opposed to before, when the JSON file could be updated separately.
  • Add a MeaMod.DNS package as instructed by dooly.
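
The HTTP side of steps A and B can be sketched like this. The mDNS advertisement (_oscjson._tcp with instance name VRChat-Client-XXXXXX) would be done with an mDNS library and is only described in comments here; the file names response-avtr.json and response.json come from the source, but everything else (function names, port handling) is an illustrative assumption.

```python
# Hypothetical sketch of the fake OSCQuery HTTP service: serve one canned
# reply for URLs ending in /avatar, and another canned reply for everything
# else. The port is random; it would then be advertised over mDNS.
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

def make_instance_name():
    # XXXXXX is a random number between 100000 and 999999.
    return f"VRChat-Client-{random.randint(100000, 999999)}"

def response_file_for(path):
    # Any URL ending in /avatar gets the avatar reply; all others get the
    # generic reply.
    return "response-avtr.json" if path.rstrip("/").endswith("/avatar") else "response.json"

class FakeOscQueryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        with open(response_file_for(self.path), "rb") as f:
            body = f.read()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

def start_service():
    # Port 0 asks the OS for a random free port; that port would then be
    # advertised on mDNS as an _oscjson._tcp service so VRCFaceTracking
    # can discover it.
    server = HTTPServer(("127.0.0.1", 0), FakeOscQueryHandler)
    return server, server.server_address[1]
```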

Update acquisition server to accept VRCFaceTracking being initialized from OSCQuery

  • Since VRCFaceTracking is now initialized from OSCQuery, we can no longer use addresses that don't start with "/avatar/parameters" like we used to.
  • Rectify the paths in the JSON file to match the OSCQuery paths.
  • Strip that prefix from the messages we receive to restore the previous behaviour.
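
The prefix stripping above amounts to something like the following. The "/avatar/parameters" prefix comes from the source; the function name and the example address are illustrative.

```python
# Hypothetical sketch: VRCFaceTracking driven by OSCQuery sends everything
# under /avatar/parameters, so the acquisition server strips that prefix to
# recover the bare addresses the previous behaviour expected.

OSCQUERY_PREFIX = "/avatar/parameters"

def rectify_address(address):
    if address.startswith(OSCQUERY_PREFIX):
        return address[len(OSCQUERY_PREFIX):]
    return address

# e.g. "/avatar/parameters/FT/v2/TongueRoll" becomes "/FT/v2/TongueRoll";
# addresses without the prefix pass through unchanged.
```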