
Gregory Osborne

- Interactive Audio Specialist
- XR Curriculum Developer


Interactive Audio

After studying Video Game Music Implementation at Berklee College of Music, Gregory discovered the power of VR to turn body movements into inputs that trigger musical events.

Rave Gazebo

Rave Gazebo is a dance-interactive virtual reality album that uses your dancing to remix a song in real time. The main input is simply gesturing in one of six directions, which switches between pre-composed layers of music. Each gesture also interacts with the environment, sending physics objects into the scene and generally creating chaos. Learn more on our special Rave Gazebo page!
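
To give a sense of how this kind of gesture input can work under the hood, here is a minimal Unity C# sketch that classifies a hand's velocity into one of six head-relative directions. The class name, threshold, and layer mapping are hypothetical illustrations, not code from the actual project.

```csharp
using UnityEngine;

// Hypothetical sketch: classify a fast hand movement into one of six
// head-relative directions, each of which could select a music layer.
public class GestureDirectionClassifier : MonoBehaviour
{
    [SerializeField] Transform head;          // e.g. the XR camera transform
    [SerializeField] float triggerSpeed = 2f; // m/s before a swing counts (assumed)
    Vector3 lastPosition;

    void Start()
    {
        lastPosition = transform.position;
    }

    void Update()
    {
        Vector3 velocity = (transform.position - lastPosition) / Time.deltaTime;
        lastPosition = transform.position;
        if (velocity.magnitude < triggerSpeed) return;

        // Express the swing in head-relative space so "left" follows the player.
        Vector3 local = head.InverseTransformDirection(velocity.normalized);
        int layer = DominantAxisToLayer(local);
        Debug.Log($"Gesture direction -> music layer {layer}");
    }

    // Map the dominant axis (+/- x, y, z) to layer indices 0-5.
    static int DominantAxisToLayer(Vector3 v)
    {
        Vector3 a = new Vector3(Mathf.Abs(v.x), Mathf.Abs(v.y), Mathf.Abs(v.z));
        if (a.x >= a.y && a.x >= a.z) return v.x > 0 ? 0 : 1; // right / left
        if (a.y >= a.z) return v.y > 0 ? 2 : 3;               // up / down
        return v.z > 0 ? 4 : 5;                               // forward / back
    }
}
```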

Loop Jam

Loop Jam won 2nd place in the Utility & Design Experiences category of the Meta Quest Presence Platform Hackathon 2024.

It is a multiplayer looper pedal that uses hand tracking and passthrough. It solves the problem of audio latency in musical collaboration by recording audio into a repeating loop and saving its location for later playback.
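
Loop Jam's source isn't shown here, but the core looper idea can be sketched in Unity C#: write incoming samples into a fixed-length ring buffer and mix that buffer back into the output on every pass. This toy version loops whatever audio the attached AudioSource plays (capturing live microphone input would go through Unity's Microphone API) and assumes stereo output.

```csharp
using UnityEngine;

// Toy looper sketch: overdub incoming audio into a ring buffer and mix the
// buffer back into the output, so recorded material repeats every loop.
[RequireComponent(typeof(AudioSource))]
public class SimpleLooper : MonoBehaviour
{
    public float loopSeconds = 4f; // loop length (assumed value)
    public bool recording;         // toggle from other scripts or the Inspector

    float[] loop;
    int position;

    void Awake()
    {
        // Two interleaved channels; assumes the output is stereo.
        loop = new float[(int)(AudioSettings.outputSampleRate * loopSeconds) * 2];
    }

    // Called on the audio thread with interleaved samples.
    void OnAudioFilterRead(float[] data, int channels)
    {
        for (int i = 0; i < data.Length; i++)
        {
            float loopSample = loop[position];
            if (recording) loop[position] = loopSample + data[i]; // overdub
            data[i] += loopSample;                                // play loop back
            position = (position + 1) % loop.Length;
        }
    }
}
```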

Dance Diffusion

Dance Diffusion is an experience developed for the MIT StageHack 2024 in which two dancers control audio playback in real time using their distance from the two Kinects in front of them. The Kinect data was sent over a local network through TouchDesigner and SuperCollider into Ableton, where it controlled dials mapped to digital signal processing parameters like pitch and filters, or triggered playback of different MIDI clips for the drum set.


In addition, a depth camera placed in front of the dancers fed live AI-generated visuals, with additional prompt input coming from a set of sliders controlled by another performer offstage.
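
The production pipeline ran through TouchDesigner, SuperCollider, and Ableton rather than Unity, but the central mapping step is simple enough to sketch in C#: scale a distance reading into the 0-127 range of a MIDI CC value. The tracking-range bounds below are assumed for illustration.

```csharp
using UnityEngine;

// Illustrative only: convert a dancer's distance reading (in meters) into a
// 0-127 MIDI CC value, the kind of mapping the Kinect -> Ableton chain does.
public static class DistanceToCC
{
    public static int Map(float distanceMeters, float minMeters = 0.5f, float maxMeters = 4f)
    {
        // InverseLerp clamps to [0, 1], so out-of-range readings pin to the ends.
        float t = Mathf.InverseLerp(minMeters, maxMeters, distanceMeters);
        return Mathf.RoundToInt(t * 127f); // 127 is the maximum MIDI CC value
    }
}
```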

Composer Demo Reel

Here is a video compilation of Gregory's interactive audio projects from 2019, before he pivoted to focus on virtual reality.

Ambisonic Tutorial

While interning at the Brookline Interactive Group, Gregory created this tutorial on how to quickly and cheaply create ambisonic files using Reaper and the Ambisonic Toolkit.

Simulation Series Interview

Gregory was given the opportunity to talk about his work on interactive music in virtual reality with the Simulation Series in 2019 while he was still at Berklee. He has been working ever since to bring his visions to reality.

XR Instructor

Starting in the middle of the COVID-19 lockdown, Gregory was hired to develop curriculum for XRTerra's online boot camp.


He has used his own VR development journey to create hands-on courses with a heavy emphasis on live troubleshooting and support.

XR Device Simulator 2.3.0 And Later

This video explains how to use the XR Device Simulator to control a VR rig from the Unity Editor using just the keyboard and mouse. Unity updated the XR Device Simulator in the XR Interaction Toolkit package version 2.3.0, so if you're using an older version of the package, look at our other video.

XRTerra Links:
Programs and Courses: http://www.xrterra.com/programs
Newsletter Signup: https://www.xrterra.com/mailing-list-signup/
XR Device Simulator 2.2.0 And Earlier: https://youtu.be/Lo6_1CnycFE
Your First VR Scene with the XR Interaction Toolkit in Unity: https://youtu.be/nlRzw2lCIkk
VR Locomotion with the XR Interaction Toolkit in Unity: https://youtu.be/sQFdjAV-dBg

Chapters:
00:00 Intro
00:23 The XR Device Simulator
01:49 Updating XRITK package to 2.3.0
02:27 Importing XR Device Simulator
03:46 Setting up a VR Scene
04:32 Adding XR Device Simulator Prefab to our scene
05:00 Looking around
05:35 Rotating only the camera
06:10 Simulating walking around
07:09 Locking the cursor
08:04 Toggling between rotation and translation
08:42 Scrolling the mouse
09:34 Selecting the controllers with toggle button
10:56 Adding locomotion to our scene
11:29 Using the controller joysticks for locomotion
12:39 Selecting the controllers with Left Shift and Spacebar
13:43 Controlling both controllers simultaneously
14:46 Controller buttons
15:19 Creating a grabbable cube
15:46 Grabbing with the XR Device Simulator
16:48 Dangers of clicking out of the Game window
17:54 Setting up Activate UnityEvent
18:24 Testing cube activation
18:50 Simulating other controller buttons
19:14 Other controller's joysticks
19:46 Adding snap turn controls
19:59 Testing out alternate controller joystick
20:56 XR Device Simulator as a debugging tool
21:51 Outro

Instructor: Gregory Osborne
Custom Input Actions With The XR Interaction Toolkit using Unity's New Input System Package

This video shows you how to detect your own inputs, such as buttons or joystick controls, by adding custom input actions to the XRI Default Input Actions asset and subscribing to their events through a C# script.

XRTerra Links:
Programs and Courses: http://www.xrterra.com/programs
Newsletter Signup: https://www.xrterra.com/mailing-list-signup/
Your First VR Scene with the XR Interaction Toolkit in Unity: https://youtu.be/nlRzw2lCIkk
C# Events in Unity: https://youtu.be/rhRGBTYONgY
XR Device Simulator 2.2.0 And Earlier: https://youtu.be/Lo6_1CnycFE

Chapters:
00:00 Intro and prerequisites
00:52 VR Scene setup
01:45 Explaining our exercise
02:14 Opening the XRI Default Input Action Asset
02:51 Creating a new Input Action
03:12 Binding the button action path
03:56 Adding an interaction condition
04:45 Creating a script
05:15 using UnityEngine.InputSystem namespace
05:35 Referencing the Input Action Reference
05:50 Explaining Button action events
06:20 Subscribing to the button's events
06:58 Creating the subscribed function
07:13 Receiving InputAction.CallbackContext parameters
08:12 Toggling a Mesh Renderer component
08:40 Creating a cube to toggle
09:13 Assigning button Input Action reference in Inspector
09:46 Finding our Input Action asset in Project window
10:02 Testing out our button action
10:49 Let's use the thumbstick to move our cube around
11:13 Creating the custom thumbstick input action
11:56 Changing the Action Type
12:13 Setting the Control Type to Vector2
12:49 Binding the thumbstick action path
13:49 Declaring a new Input Action Reference
13:59 Subscribing to the thumbstick action
14:40 Extracting information from the CallbackContext type
15:07 context.ReadValue
15:39 Putting extracted vector into the Console
16:02 Assigning thumbstick Input Action reference in Inspector
16:22 Debugging thumbstick inputs with the headset
17:13 Testing out thumbstick output in the Console
17:57 We already have a reference to the cube
18:14 Describing what we're about to do
18:35 Converting Vector2 to Vector3
19:22 Moving the cube by our new vector
19:59 Testing out a very fast cube
20:28 We'd probably multiply the vector times Time.deltaTime
20:39 Summary

Instructor: Gregory Osborne
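
As a condensed sketch of what the video builds: the script below subscribes to a custom button action that toggles a cube's Mesh Renderer, and polls a Vector2 thumbstick action to move the cube, scaled by Time.deltaTime as suggested near the end. It assumes both actions already exist in the XRI Default Input Actions asset and are enabled at runtime (for example by the scene's Input Action Manager).

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch: react to two custom input actions, a button and a thumbstick.
public class CustomInputExample : MonoBehaviour
{
    [SerializeField] InputActionReference buttonAction;     // Button action
    [SerializeField] InputActionReference thumbstickAction; // Value/Vector2 action
    [SerializeField] MeshRenderer cubeRenderer;             // cube to toggle and move
    [SerializeField] float moveSpeed = 1f;

    void OnEnable()  { buttonAction.action.performed += OnButtonPressed; }
    void OnDisable() { buttonAction.action.performed -= OnButtonPressed; }

    // CallbackContext carries details about the input event that fired.
    void OnButtonPressed(InputAction.CallbackContext context)
    {
        cubeRenderer.enabled = !cubeRenderer.enabled;
    }

    void Update()
    {
        // Read the thumbstick each frame and convert its Vector2 to a flat Vector3.
        Vector2 stick = thumbstickAction.action.ReadValue<Vector2>();
        Vector3 move = new Vector3(stick.x, 0f, stick.y);
        cubeRenderer.transform.position += move * moveSpeed * Time.deltaTime;
    }
}
```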
Physics Joints in Unity

This video explains several Joint components in Unity and how they can be used to either constrain or power physics-based objects in your scene. We cover Fixed Joints, Spring Joints, Hinge Joints, Character Joints, and Configurable Joints. We also mention how you can make a Ragdoll on a character imported from Mixamo using Unity's Ragdoll creator.

XRTerra Links:
Programs and Courses: http://www.xrterra.com/programs
Newsletter Signup: https://www.xrterra.com/mailing-list-signup/
Colliders and Rigidbodies: https://youtu.be/K4JwfpXJFik

Chapters:
00:00 Introducing Joint components
01:14 Fixed Joint scene setup
02:02 Fixed joint connected to world space
02:30 Connecting the Fixed Joint
03:55 Break force and break torque
05:01 Disintegrating objects
05:44 Introducing the Spring Joint
06:07 Spring Joint orange box gizmo
07:07 Connecting the spring to the pull cube
08:05 Spring Joint anchor
09:34 Spring force
09:51 Enable collision
10:16 Damper
10:47 Min and Max Distance
11:09 Introducing the Hinge Joint
11:23 Hinge Joint scene setup
12:01 Edit Angular Limits button
12:43 Testing Doggy Door hinge
12:54 Moving the hinge anchor
13:22 Changing the axis of rotation
13:52 Setting up a door hinge
14:44 Hinge limits
16:04 Motors to add force
17:21 Character Joint setup
18:44 Testing the Character Joint
19:14 Explaining the Character Joint gizmo
20:40 Experimenting with the gizmo
22:00 Creating Ragdolls
22:39 Downloading a character from Mixamo
23:33 Turning our character into a ragdoll
25:29 Introducing the Configurable Joint
26:55 Constraining movement to a linear axis
27:50 Linear movement limits
29:18 Final thoughts on Configurable Joint
29:57 Outro

Instructor: Gregory Osborne
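
As one example of the ideas covered, a hinge like the video's door can also be configured from code rather than the Inspector. This sketch uses Unity's HingeJoint API; the anchor, limit, motor, and break-force values are assumed for illustration.

```csharp
using UnityEngine;

// Sketch: build a motorized, limited door hinge from code.
[RequireComponent(typeof(Rigidbody))]
public class DoorHingeSetup : MonoBehaviour
{
    [SerializeField] Rigidbody doorFrame; // leave null to connect to world space

    void Start()
    {
        HingeJoint hinge = gameObject.AddComponent<HingeJoint>();
        hinge.connectedBody = doorFrame;
        hinge.anchor = new Vector3(-0.5f, 0f, 0f); // hinge along the door's edge
        hinge.axis = Vector3.up;                   // swing around the Y axis

        hinge.useLimits = true;
        hinge.limits = new JointLimits { min = 0f, max = 110f }; // degrees

        hinge.useMotor = true;
        hinge.motor = new JointMotor { targetVelocity = 45f, force = 10f };

        hinge.breakForce = 5000f; // the joint disconnects past this force
    }
}
```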
Sourcing External Textures And Models for Unity

This video explains the different kinds of texture maps and how you can use ambientcg to find texture packs to help build realistic materials. It also mentions poly.pizza as a resource for free low-poly models. Lastly, we cover how to use the Unity Asset Store to import models, animations, environments, and other assets, including how to solve pink shader errors on imported assets.

XRTerra Links:
Programs and Courses: http://www.xrterra.com/programs
Newsletter Signup: https://www.xrterra.com/mailing-list-signup/
Shaders, Materials, and Textures in Unity: https://youtu.be/G0KMz65cmr4

Chapters:
00:00 Intro
00:20 Explaining the URP Lit Shader properties
01:56 Base Map
02:22 Metallic Map
02:59 Occlusion Map
03:44 Normal Map
04:41 Emissive Map
04:55 Texture Atlas
05:11 Making a Quad
05:29 https://ambientcg.com/
06:45 Extracting zip into Unity project
07:36 Creating a brick wall material
08:00 Assigning textures to material
08:43 NormalGL vs NormalDX
08:56 Setting normal map texture type
10:50 Height map
11:18 Occlusion map
12:15 https://poly.pizza/
13:20 Creative Commons attribution
13:57 Putting the model into our scene
15:04 Unity Asset Store
15:54 Searching for a free car asset
16:26 Selecting a car asset
16:46 The Package Manager
17:17 My Assets in the Package Manager
17:37 Importing the car into our project
18:22 Package demo scene
19:03 Fixing Render Pipeline issues (pink!)
21:56 Summary of the Unity Asset Store
22:45 Review and outro

Instructor: Gregory Osborne
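
The texture slots discussed above map to properties on the URP Lit shader, so a material can also be assembled in code. The property names and keywords below are URP's own; the texture fields are placeholders for maps downloaded from a site like ambientcg.

```csharp
using UnityEngine;

// Sketch: build a URP Lit material from separate texture maps at runtime.
public class BrickWallMaterial : MonoBehaviour
{
    [SerializeField] Texture2D baseMap;      // color information
    [SerializeField] Texture2D normalMap;    // import with Texture Type: Normal Map
    [SerializeField] Texture2D occlusionMap; // ambient occlusion

    void Start()
    {
        var material = new Material(Shader.Find("Universal Render Pipeline/Lit"));
        material.SetTexture("_BaseMap", baseMap);
        material.SetTexture("_BumpMap", normalMap);
        material.EnableKeyword("_NORMALMAP");       // normal maps are keyword-gated
        material.SetTexture("_OcclusionMap", occlusionMap);
        material.EnableKeyword("_OCCLUSIONMAP");
        GetComponent<Renderer>().material = material;
    }
}
```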
Animator Controller in Unity

This video explains how to use the Animator Controller asset in Unity to manage transitions between animation clips on your Game Objects. We explain how to create transitions between animation states, how to set conditions on those transitions, and how to set up parameters to trigger those conditions. We also discuss how animator layers can animate different parts of the same model at the same time.

XRTerra Links:
Programs and Courses: http://www.xrterra.com/programs
Newsletter Signup: https://www.xrterra.com/mailing-list-signup/
Creating an Animation Clip: https://youtu.be/YNidyX5KOe0
UI in Unity: https://youtu.be/w9Z24uRGqPM

Chapters:
00:00 Intro
00:21 Scene cylinder setup
00:58 Creating two animation clips
03:02 Opening the Animator window
04:02 Animation states as containers
04:33 Animation state speed
05:19 Animation state Cycle Offset
05:55 Animation state parameters
06:30 Navigating the Animator window
06:51 Default Animation state
07:25 Selecting animated object in Hierarchy
07:46 Setting the default state
08:13 Transitioning between states
09:14 Transitioning to itself
09:40 Deleting a transition
09:49 Testing out looping animations
10:09 Transition settings
10:56 Animator controller layers
12:17 Creating a new layer
12:44 Layer weight
13:29 Triggering an animation
14:36 Transition conditions
14:54 Has Exit Time
15:24 Animator parameters
15:58 Setting conditions to use parameters
16:11 Testing out trigger parameter
16:26 Unchecking Has Exit Time
17:27 Making a button
18:11 SetTrigger() from UnityEvents
19:15 Testing out button trigger
19:59 Outro

Instructor: Gregory Osborne
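
The SetTrigger() call wired up at the end of the video can live in a script as small as this sketch; the trigger name is a placeholder for whatever parameter you created in the Animator window.

```csharp
using UnityEngine;

// Sketch: a public method a UI Button's OnClick() UnityEvent can call to
// fire a trigger parameter on an Animator Controller.
public class AnimationTriggerButton : MonoBehaviour
{
    [SerializeField] Animator animator;
    [SerializeField] string triggerName = "Spin"; // placeholder parameter name

    public void FireTrigger()
    {
        animator.SetTrigger(triggerName);
    }
}
```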
AI Navigation in Unity

In this video we cover how to get non-player characters to move around your environment using the Nav Mesh Agent component and a Navigation Mesh baked into your environment. We discuss defining terrain cost, creating Off Mesh Links, and how to use Nav Mesh Obstacles. In addition, we discuss how to set the destination of your agent through code, how to set up a waypoint patrol system, and how to detect and chase a player before losing interest and returning to the waypoints.

XRTerra Links:
Programs and Courses: http://www.xrterra.com/programs
Newsletter Signup: https://www.xrterra.com/mailing-list-signup/

Chapters:
00:00 Intro
00:57 Installing AI Navigation Package
01:47 Scene Setup
03:40 Agent Types
05:35 Nav Mesh Surface
06:20 Object Collection
06:52 Baking the Nav Mesh and Visualizing
07:25 Baking for different Agent Types
08:35 Creating Nav Mesh Agent
09:57 Create Follow Target Script
10:49 using UnityEngine.AI;
11:37 NavMeshAgent.SetDestination()
12:23 Target sphere
13:07 Testing Follow Target
13:56 Path Visualization Gizmo
14:26 Agent Speed
15:38 Agent Acceleration
16:19 Agent Angular Speed
17:08 Agent Stopping Distance
19:05 Navigation Areas
20:20 Nav Mesh Modifier
21:31 Creating Navigable Layer
22:50 Testing out Difficult Terrain Area
23:35 Off Mesh Link
26:43 Generated Drop Height Links
27:47 Testing Drop Height Links
28:15 Generated Jump Distance Links
30:07 Testing Jump Distance Links
30:37 Nav Mesh Obstacles
32:46 Carving out of the Nav Mesh
35:09 Patrol Script
40:36 Waypoint Setup
41:58 Testing Waypoints
43:11 Explaining Target Trigger Detection
45:02 Player Detector Script
46:22 Testing Player Detection
47:00 Chasing the Player
50:34 Losing the Player
53:48 Testing Losing the Player
54:52 Outro

Instructor: Gregory Osborne
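
The follow-target behavior at the heart of the video boils down to a few lines. This condensed sketch assumes the scene already has a baked Nav Mesh; the stopping distance is an illustrative value.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch: steer a Nav Mesh Agent toward a target every frame.
[RequireComponent(typeof(NavMeshAgent))]
public class FollowTarget : MonoBehaviour
{
    [SerializeField] Transform target; // e.g. the target sphere or the player

    NavMeshAgent agent;

    void Awake()
    {
        agent = GetComponent<NavMeshAgent>();
        agent.stoppingDistance = 1.5f; // stop short instead of crowding the target
    }

    void Update()
    {
        agent.SetDestination(target.position);
    }
}
```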

VR Developer Foundations

This is an 8-class course that teaches you the foundations of VR development in the Unity 3D engine, introducing the XR Interaction Toolkit, collision and trigger detection, and animations, with an emphasis on learning C# programming fundamentals.

AR Playground: CoSpaces

Many school systems in the US do not have access to computers that can run more powerful programs such as Blender and Unity, and instead rely on Chromebooks to give their students internet access. In order to teach these students XR development skills, Gregory created a course that can be taught using only web-based applications like CoSpaces and Tinkercad.

AR Playground: Reality Composer

The iPad is an incredibly powerful device with several key pieces of hardware that make it a useful tool for the 21st century. This course has students take advantage of the camera, microphone, and internet access of their iPads to create their own interactive Augmented Reality experience in Apple's Reality Composer app.
