Connect your headset using the Link cable and perform the following in the Oculus App:

1. Click Devices and ensure your headset is showing up.
2. Select the connected device and click Device Setup in the right menu.
3. Click Link Cable > Connect Your Headset > Continue.
4. Select Test Connection.
5. Ensure you get a Compatible connection message after the test is complete.

If the test returns an Incompatible connection message, try a different cable.
Step 5. Check bandwidth

For Color Passthrough, the USB connection should provide an effective bandwidth of at least 2 Gbps. You can measure the connection speed using the USB speed tester built into the Oculus App:

1. Go to Devices.
2. Select the connected device.
3. Click USB Test and then Test Connection.

If the connection speed is low, try a different cable or USB adapter.
Step 6. Check Link connection on headset

To ensure your headset connects to Link properly, follow these steps on your headset:

1. Go to Settings > System.
2. Next to Quest Link, toggle on access.
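Once Link access is toggled on, you can also confirm from inside a running Unity app that the headset is being detected. Below is a minimal sketch using Unity's built-in UnityEngine.XR input API; the component name is our own, not part of any SDK:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch: list the XR input devices Unity currently sees.
// When Quest Link is active, the headset and Touch controllers
// should appear in this list.
public class LinkDeviceCheck : MonoBehaviour
{
    void Start()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevices(devices);

        if (devices.Count == 0)
        {
            Debug.LogWarning("No XR devices found - is Quest Link enabled on the headset?");
            return;
        }

        foreach (var device in devices)
            Debug.Log($"XR device: {device.name} ({device.characteristics})");
    }
}
```

An empty device list usually means the Link session is not active or no XR plug-in provider is loaded in the project.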
|
Basic Link usage for app development
|
As a developer, you can use Link in two modes:
|
Directly run the scene in the Unity Editor by hitting Play(►).
|
Run your project as a standalone PC app.
|
Regardless of the mode, the app collects full tracking data from your headset.
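For example, the head tracking data can be read the same way in both modes. Here is a minimal component sketch using Unity's built-in XR input API (the class name is hypothetical):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: log the headset pose each frame. This behaves the same
// whether the scene runs over Link in the Editor (Play mode) or
// as a standalone PC build.
public class HeadPoseLogger : MonoBehaviour
{
    void Update()
    {
        InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);
        if (!head.isValid)
            return; // No headset connected (e.g., Link not active).

        if (head.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position) &&
            head.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
        {
            Debug.Log($"Head position: {position}, rotation: {rotation.eulerAngles}");
        }
    }
}
```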
In most cases, running your app on the PC over Link is similar to running it on the headset. While the app runs on the PC via Link (in both "Play in Editor" and standalone modes), you see the 3D view of the app inside your headset as well as the normal app window on the PC screen.

Note: Make sure you enable a plug-in provider (e.g., Oculus XR Plug-in) under Edit > Project Settings > XR Plug-in Management > Windows, Linux and Mac settings.
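If you want to catch a missing plug-in provider at runtime rather than by inspecting Project Settings, you can query the XR Plug-in Management state. A sketch using the com.unity.xr.management API (the component name is our own):

```csharp
using UnityEngine;
using UnityEngine.XR.Management;

// Sketch: verify that an XR plug-in provider (e.g., the Oculus
// XR Plug-in) was enabled and initialized. If activeLoader is
// null, check Edit > Project Settings > XR Plug-in Management.
public class XrLoaderCheck : MonoBehaviour
{
    void Start()
    {
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        if (manager == null || manager.activeLoader == null)
            Debug.LogWarning("No active XR loader - enable a plug-in provider in XR Plug-in Management.");
        else
            Debug.Log($"Active XR loader: {manager.activeLoader.name}");
    }
}
```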
Link log gathering from Unity Editor and the OculusLogGather tool

If you need to collect Link logs, you can do so through the Unity Editor and the Oculus App:

1. In the Unity Editor, go to Window > General > Console.
2. Click Play in the Editor while your headset is in PC Link.
3. Click the tri-dot (three dots) icon in the console's top right and select Open Editor Log.
4. Locate OculusLogGather.exe on your hard disk. The location depends on where you installed the Oculus App; typically it is in the C:\Program Files\Oculus\Support\oculus-diagnostics folder.
Finally, run the OculusLogGather.exe tool.

Key Terms (Glossary)
You should familiarize yourself with these common terms and concepts used in the Meta Quest domain.
App Lab
A way for third-party Meta Quest developers to distribute apps directly to consumers safely and securely, via direct links or platforms like SideQuest, without requiring store approval and without sideloading. App Lab apps are not promoted within the Quest Store and other discovery surfaces.

Augmented Reality (AR)
Integration of digital information with the user's environment in real time. Unlike virtual reality (VR), which creates a totally artificial environment, AR users experience a real-world environment with generated information overlaid on top of it.

Avatars
Using the Meta Avatars SDK, developers can provide user-created Meta Avatars to increase social presence and enhance VR immersion. Avatars are provided as modularized full-bodied torsos that give users the flexibility to create their own unique identity, persistent across the Meta ecosystem. Leveraging advanced body tracking, realistic Avatar poses can be extrapolated from the Meta headset and Touch controllers to give users a sense of self in the VR world.

Hand tracking
Hand tracking enables the use of hands as an input method for Meta Quest headsets. When using hands as the input modality, hand tracking delivers a new sense of presence, enhances social engagement, and enables more natural interactions with fully tracked hands and articulated fingers.

Meta Quest Link
Connects Quest hardware to a PC via USB or Wi-Fi. Useful for both consumer and development workflows.

Meta XR Audio SDK
Contains everything needed to create audio experiences for XR applications that properly localize sounds in 3D space and create a sense of space in the virtual environment, allowing users to be fully immersed in the auditory scene. The spatial audio rendering and room acoustics functionality included in the SDK improves and expands upon the legacy Oculus Spatializer, and this feature set will expand in future releases.

Meta XR All-in-One SDK
An all-in-one source for core features, components, scripts, and plugins to ease the app development process. It comes as a package that contains multiple SDKs.

Meta XR Core SDK
A package that contains the essentials for developing in VR with Meta XR, including controllers, the boundary system, splash screens, and mixed reality APIs such as Passthrough, Scene, and Spatial Anchors.

Meta XR Interaction SDK
A library of modular, composable components that allows developers to implement a range of robust, standardized interactions (including grab, poke, raycast, and more) for controllers and hands. Interaction SDK also includes tooling to help developers build their own hand poses.

Meta Movement SDK
Supports the Avatar SDK by extending capabilities to third-party developers that don't have embodied avatars or choose not to adopt Meta's Avatar capabilities.

Meta XR Voice SDK