Learn the Art & Science of Creating Virtual Reality Experiences
I am a professional VR developer, with over 20 years' experience in games and interactive media. My passion for VR began back in 2013, when I acquired my first VR headset: an Oculus DK1. I remember loading up the Tuscany demo and being completely awestruck!
Since then, I have been working primarily in VR. A decade on, and my passion for VR has not abated.
Indeed, I’m keen to help others join in on the fun!
This led me to create this channel, which features easy-to-understand tutorials on XR development (using the Unity game engine). Whether you're a beginner or a seasoned developer, my goal is to help you understand the fundamentals of XR development, and show you how to create amazing experiences for your users.
Comments
Thank you for this, and your other videos in this series. VERY helpful and well put-together.
Ah crap. The passthrough has now stopped working. I'm getting a black background. Scanning the room or adding the feathered planes broke something. EDIT - This happened because I somehow switched to a Windows build at the end of video one, when LW told us what to do to preview the build in the headset. I fixed it by just switching the platform back to Android.
I tried 3 or 4 other tutorials to achieve this yesterday and none of them worked for me. Errors and problems at every step. THIS however, is the one. Just got to the end of this video and it's working great. This guy knows his stuff, and goes into great detail and takes care to make sure the project is set up just right. Looking forward to working through the next vids.
Thanks for the video. I'm conducting a virtual reality (VR) experiment with Unity 3D. In one scene, I need to move a person from a bed, lifting them with both hands from behind and carrying them to a nearby chair. What do you think is the best approach for handling this scene?
@ludicworlds I have been following you on here for a while. Your tutorials are great. Keep it going!! I'm trying to give a virtual plane a persistent anchor that will align my XR world to a fixed point in a real room, but the XR world always ends up somewhere different. Is there something I'm overlooking?
Thanks for this video, crystal clear, but I'm sad you did not explain what "schemes" are..
Thank you for the feedback. I think I may need to update this tutorial series soon, as so much has changed in XR Interaction Toolkit Version 3.
Why do I not have a settings folder under my assets?
The 'Settings' folder in 'Assets' typically contains various project-specific settings files. In this project, these files define Graphics Settings (i.e., configuring the Render Pipeline). However, it could also contain Audio Settings, Build Settings, etc.
@@LudicWorlds Hi, thank you for your reply. Your video tutorials are so helpful! I still am a little confused because I do not have this settings folder within my assets like you show in your video, and because of that I am unable to apply the URP balanced renderer asset to the scriptable render pipeline setting like you show at 10:17. Let me know if you know why this could be happening and thank you again!
Hi, I see that you disable post-processing completely, so there is no way to use the post-processing bloom effect in a Quest 2 project?
The Quest 2 uses a tile-based rendering architecture, which makes full-screen render passes, such as those required for post-processing effects like "Bloom", quite expensive. While you can use post-processing effects if they are crucial for your app, it's important to be mindful of the potential performance impact.
@@LudicWorlds Ok thanks, is there any difference with the Quest 3?
The Quest 3's GPU also uses tile-based rendering, so you still need to keep that in mind.
Thank you for free asset!
Hello, many thanks for the great tutorials and this video. After updating to Unity 6 (6000.0.7) with the new AR Foundation package, both the outcome of the tutorial and this sample project freeze in mixed reality after some seconds (it could be 5 or 30), while with the latest Unity LTS they do not. Has this happened to you as well? Thanks in advance!
After downgrading to 6000.0.5 it works again, in case someone encounters the same issue.
Strange, I upgraded to Unity 6 (6000.0.7) but wasn't able to replicate this. Was it happening in a particular scene?
Installed perfectly. I did not have to do anything extra Newtonsoft-ish... Running 6000.0.0.5f with Hub 3.8
That's great! Looks like there was a problem with the project's manifest file. Unity has recently pushed a fix to the GitHub repo: "Update manifest.json to include new Newtonsoft package dependency". github.com/Unity-Technologies/arfoundation-samples/commit/23b6685bfa42523849b4c4aa45bdaf337fdb1137
Are Unity still trying to take devs' money?
I think so, yes.
You explain things so well. Love your videos.
Thank you, I really love your tutorials. However, in the new version of Inworld now, there is no longer the Inworld Player in the InworldController (I think changes are made to the PlayerController). Would you know how to reconfigure Inworld PlayerControllerRPM for the XR Origin Camera for the newer version? Thanks a million!
Thank you, I am glad you are enjoying my videos! :) It's been quite some time since I've looked at Inworld, so I don't know how to fix this in the current version. However, I've noticed that Inworld are now releasing 'Getting Started' tutorials on their KZread channel. Maybe this Unity series can help? kzread.info/dash/bejne/eoahtrN_YKa9Z7Q.html
How can we make Mixed Reality toggleable? In case we want players to be able to go from MR to VR? What components have to be disabled?
Best, clearest explanation from scratch.
Could I skip building, and just play in Unity using the simulator?
THANKS BRO!!
Excellent tutorial as always. Wish list: a spatial anchor shared through the cloud to enable FPS in see-through, by aligning room-scan orientation in two or more...
Thank you so much for this amazing video. It is really helpful! I have question, when will you cover a video on Target Priority mode in interactors?
I’ve just started my Master’s project which is completely mixed reality focused on quest 3. These tutorials are an absolute blessing! Thank you so much. Can’t wait for the next ones. Do you know if most of these OpenXR concepts are also applicable to the Meta all in one SDK?
Thank you, so glad they have been useful in your project! :) The concepts are indeed very similar, they have just been implemented in a different way. I would say that the 'XR Interaction Toolkit' makes it easier to port your project to different platforms. However, the 'Meta SDK' will give you the tightest integration with the Quest 3, and access to the latest features (so I would recommend it if you want to develop solely for the Quest).
Hey, do you have the source code for this? I'm trying to make something very similar and I'm kinda stuck lol. Please and thanks!
Thx
I must say that you are really good at explaining. I have tried to understand the XR Interaction Toolkit for some time now. This is the first time that I have found a tutorial that really explains it well. Need to check out your other videos.
Thanks for the tutorial. But when I open the app, there are no planes visible. I am using the AR Feathered Plane prefab, and I have already set up my room space too.
Hi, is there any method to import our own developed characters? Or maybe make the characters robots?
What is the difference between Hidden and All? I don't know if it's just me, but I don't see much of a difference.
Very slow. It was generating images forever for me, and downloading only works if you pay.
Would love to see more videos about XRI. Also about the latest version 3.x
Thanks for the suggestion! My recent 'Mixed Reality' tutorial series also uses XRI, and I am currently extending this with some tutorials that use XRI 3. I plan, however, to cover XRI 3 more generally in the near future (particularly since the API/Components have changed significantly).
Does this work with the Oculus Quest 2?
Yes, Passthrough certainly works on the Oculus 2. However, it will be monochrome and low resolution.
Your tutorials are amazing. You are a great teacher. I subscribe to your Patreon to give some motivation to continue 🤭
Thank you for your kind words and support! :) I am working on a new Mixed Reality tutorial right now (using some of the new features in the OpenXR: Meta 2 package).
@@LudicWorlds Can't wait to see that! Mixed Reality opens so many possibilities. It's new and exciting. For the moment, I am quite disappointed with the number of "productivity apps" available on the Quest 3. I hope (and think) that many tools will come in the not too distant future.
Best tutorial series I've seen in a long time, thanks a lot ! Hoping to see more of those, hand tracking would be nice !
Thanks for this tutorial series, it's been a great starting point to learn about MR dev.
I'm glad you found it useful! :)
I enjoyed it. I hope you can cover using a virtual keyboard in the next lecture, please.
Professional teaching skills, thank you!
How do you place an object at a particular position, just like we do in mobile AR?
Take a look at Part 5: Raycasts & Anchors: kzread.info/dash/bejne/i4yssbGoZcLgZsY.html In that tutorial, we detect where a controller ray hits a collider, and then place an object at that specific position.
Thank you for the video! It might be a stupid question, but do I understand correctly that developing for mobile AR and AR on the Quest is fundamentally different? Like, the features from ARCore wouldn't be available on the Quest at all, right?
By using the 'Unity OpenXR: Meta' package, we are also leveraging the 'AR Foundation' API. The API remains the same regardless of platform; however, the features available to you on the Quest may be limited compared to mobile. Check out 'Features' here: docs.unity3d.com/Packages/[email protected]/manual/index.html
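To illustrate the point about the API staying the same across platforms, here is a minimal sketch of plane detection written against AR Foundation. This is an assumption-laden example (the class name and logging are mine, and it uses the `planesChanged` event from AR Foundation 5.x; in 6.0 this was superseded by `trackablesChanged`), but the same script would compile for mobile AR and for the Quest alike, with only the platform's provider deciding which features actually work:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example: logs each newly detected plane.
// The same AR Foundation code runs on mobile AR and on the Quest,
// provided the platform's OpenXR/ARCore provider supports plane detection.
public class PlaneLogger : MonoBehaviour
{
    [SerializeField] private ARPlaneManager planeManager;

    private void OnEnable() => planeManager.planesChanged += OnPlanesChanged;
    private void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    private void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"New plane detected: {plane.trackableId}");
    }
}
```

Attach this to a GameObject in a scene that already has an XR Origin with an `ARPlaneManager`, and assign that manager in the Inspector.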
Best Tutorial
Teach us how to give arms/an avatar to the body, based on a prefab.
Show us how to make the zombie walk around and try to attack you
Also, is there any way to make the planes from your scan visible in Unity?
⚠ IMPORTANT UPDATE ⚠ 'Unity OpenXR: Meta' Version 2 has now been officially released! Please install 'Unity 6 Preview' (not 'Unity 6 Beta', as instructed in the video). Also, when it comes to installing the 'Unity OpenXR: Meta' package 'by name', use the following Name and Version number in the relevant text fields: ► Name: com.unity.xr.meta-openxr ► Version: 2.0.0
I like your channel, very good content, however, I avoid using anything which needs internet (API) for running the game. I wish we could upload everything offline.
Thanks! :) I am actually working on some Meta Quest tutorials using 'Unity Sentis', which includes using a local 'Whisper Tiny' AI model for speech recognition (no internet API required!). Here's a little demo: kzread.infovp7xtKEkEk8 There's a whole bunch of AI models ready-made for 'Unity Sentis' on Hugging Face: huggingface.co/unity All designed to run locally on your machine!
Edit isn't showing up
Thank you for the great tutorial. I have a question. I followed along line by line from your script, but after I save and close, I don't see any "Toggle Planes Action" field. Instead I get this error message from Unity: "error CS0246: The type or namespace name 'InputActionReference' could not be found (are you missing a using directive or an assembly reference?)". I also tried making the variable public, but that didn't work either. Would you happen to know what I did wrong?
Check the 'using' statements at the top of the file. Make sure they include the following using directive: using UnityEngine.InputSystem;
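For reference, a minimal sketch of what the top of such a script might look like (the class and field names here are illustrative, not necessarily the exact ones from the video). The `InputActionReference` type lives in the Input System package, which is why the `using UnityEngine.InputSystem;` directive is required for the serialized field to compile and show up in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.InputSystem; // required for InputActionReference

// Hypothetical example script for toggling plane visibility.
public class TogglePlanesExample : MonoBehaviour
{
    // Shown in the Inspector as 'Toggle Planes Action' once the script compiles.
    [SerializeField]
    private InputActionReference togglePlanesAction;
}
```

Note that the field will only appear in the Inspector after all compile errors in the project are resolved; a single CS0246 anywhere will keep Unity from refreshing the component.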
Same problem here. Checked my using statements, no problem there. I restarted the project, had to open it in safe mode, closed it again, opened it again, and my scene was completely empty. It only made things worse, because now most of the work is gone.
I love your vids! Could you please explain why there are so many different MR setups? For example, this one with AR Foundation, Meta's MR Utility Kit, Building Blocks, etc. It's seriously confusing the heck out of me, and I'm not sure which to use, or which no longer works. Feels like this is going in infinite circles 😵💫
Thank you very much! :) The diversity in MR setups is often a source of frustration for me also (sometimes tech changes before I complete a tutorial on it!). I guess it's a consequence of what is still such a rapidly evolving field, hopefully it will stabilize eventually. If you are going to stick solely to the Meta Quest, it's probably best to go with Meta's official XR SDK. However, if you plan to port your app to other XR platforms, the 'XR Interaction Toolkit' (which I use in these tutorials) should make the conversion process a lot easier.
Nice to figure out the classifications.HasFlag thing... THX 🙂
You can also do something like this:

```csharp
if ((plane.classifications & PlaneClassifications.Table) != 0)
{
    // The plane is classified as a table
}
```

The above is apparently slightly more performant than `classifications.HasFlag()`. I went for the `HasFlag()` option for the sake of clarity.
Thank you ^^~~
It's really helpful!
Waiting for the next season..
I should have a new video for you in the next couple of days - about the new 'OpenXR: Meta' V2 pre-release.
Dude, you seriously have some of the best tutorials, very impressive! Please continue the MR series, like giving the zombie object collision, damage collision, occlusion, movement, custom animations 🙏