As a software developer, I spend my time building websites and web apps, but for a long time, I've also had a side interest in virtual reality. I've named my Oculus Go "Betty" and get giddy talking about experiencing gondola rides through Venice, hopping on breathtaking roller coasters, and traveling through veins in the human body. Since I mainly use React, I was excited to learn I could develop virtual reality experiences with a library I already know and love.
To try my hand at React VR, I recently created a virtual reality application called Find Your Zen, which allows the user to select an immersive meditation environment, each of which comes with its own mantra inspired by the very excellent show "The Good Place." In May 2018, shortly after I built my app, Facebook released a revamped and rebranded version of React VR called React 360, with multiple changes and significant improvements.
As I ported my application over to React 360, I took note of some important differences between React VR and React 360. I wrote the following article for developers who possess a working knowledge of React. If you're unfamiliar with the library, I recommend starting here first.
If you want an introduction to React VR (as well as Recompose, whose utility functions help manage my application's state), you can find that here and here.
Viewing the finished demo code
$ git clone https://github.com/lilybarrett/find-your-zen.git
$ cd find-your-zen
$ npm i
$ npm start
File structure
The basic file structure for React VR was as follows:
- index.vr.js = entry point for my app
- vr folder = stores the code that launches my app; includes the index.html and client.js files
- static_assets = stores images, audio files, and other external resources
And here's the new file structure for React 360:
- index.js = entry point for my app
- client.js = sets up the "runtime," which turns my React components into 3D elements in our VR landscape
- index.html = as in the typical React application, provides a place for me to mount my React code
- static_assets = stores images, audio files, and other external resources
I set up the rest of my folder structure as follows:
components // shared components
  base-button
  content
consts
providers // Recompose providers live here
scenes
  home-environment
    components
      menu
      title
      zen-button
      zens
  zen-environment
    components
      home-button
      mantra
static-assets
  images
  sounds
Shared components live in the top-level components folder. Stored in scenes, my HomeEnvironment (the first environment to load, where my user accesses a menu of meditation environments to explore) and ZenEnvironment scenes each have their own sets of relevant components. My state management is handled by Recompose providers and functionally composed into each component that needs access to state.
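If you haven't seen that pattern before, here's a minimal sketch of how a withAppContext/usingAppContext provider pair can be wired up with Recompose's withContext and getContext utilities. The prop names below are assumptions for illustration, not necessarily my app's exact implementation:
// providers/app-context.js -- a minimal sketch, assuming these prop names
import PropTypes from "prop-types";
import { withContext, getContext } from "recompose";

// the shape of the shared state (assumed names, for illustration only)
const contextTypes = {
  selectedZen: PropTypes.number,
  zenClicked: PropTypes.func,
};

// wraps a component so it provides these props as context to its children
export const withAppContext = withContext(contextTypes, (props) => ({
  selectedZen: props.selectedZen,
  zenClicked: props.zenClicked,
}));

// wraps a component so it consumes those props from context
export const usingAppContext = getContext(contextTypes);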
Mounting the app
In React VR, my client.js was pretty simple and didn't give me too many configuration options:
// React VR application -- vr/client.js
// Auto-generated content.
// This file contains the boilerplate to set up your React app.
// If you want to modify your application, start in "index.vr.js"

import { VRInstance } from "react-vr-web";

function init(bundle, parent, options) {
  const vr = new VRInstance(bundle, "MeditationApp", parent, {
    cursorVisibility: "auto",
    // Add custom options here
    ...options,
  });

  vr.render = function() {
    // Any custom behavior you want to perform on each frame goes here
  };
  // Begin the animation loop
  vr.start();
  return vr;
}

window.ReactVR = {init};
In React 360, I can mount my application's content to a surface or a location. Surfaces, as the docs say, "allow you to add 2D interfaces in 3D space, letting you work in pixels instead of physical dimensions." In my case, I wrap the visual content of my application in an AppContent component, which I mount to React 360's default, cylindrical surface. This surface projects the content onto the inside of a cylinder, centered in front of the user, with a 4-meter radius.
I can create my own custom surfaces in React 360, increasing or decreasing the radius or making the surface flat rather than cylindrical.
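As a sketch of what that might look like in client.js (the "InfoPanel" root component, dimensions, and angle below are placeholders of mine, not part of the demo app):
// client.js -- a sketch of mounting content to a custom flat surface
import { ReactInstance, Surface } from "react-360-web";

function init(bundle, parent, options = {}) {
  const r360 = new ReactInstance(bundle, parent, { fullScreen: true, ...options });

  // a flat 600 x 400-pixel panel instead of the default 4-meter cylinder
  const infoPanel = new Surface(600, 400, Surface.SurfaceShape.Flat);
  // rotate it 45 degrees to the right at eye level (angles are in radians)
  infoPanel.setAngle(Math.PI / 4, 0);

  r360.renderToSurface(r360.createRoot("InfoPanel"), infoPanel);
}

window.React360 = { init };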
I also mount the entire app itself to React 360's default location, which allows my app to take advantage of React 360's runtime.
The new runtime is one of React 360's significant advantages over React VR. Why? Separating the rendering, or "runtime," aspects of the application from the application code reduces latency: the time between a user action and the moment the pixels in the view update in response to that action. If the data transfer is too slow, the result is a choppy, disorienting view for the user, similar to buffering on a YouTube video or static on a television screen.
As the React 360 docs further explain, web browsers are single-threaded, which means that while part of the app updates behind the scenes, that work can block or delay the data transfer. "This is especially problematic for users viewing your 360 experience on a VR headset, where significant rendering latency can break the sense of immersion," the docs tell us. "By running your app code in a separate context, we allow the rendering loop to consistently update at a high frame rate."
In my index.js, I register my MeditationApp (see the second code block below) to mount to the default location, giving my entire application access to the runtime, while I register the content I want to display (again, stored in AppContent) to the default cylindrical surface.
// components/content.js
import React from "react";
import { View } from "react-360";
import { HomeEnvironment, ZenEnvironment } from "../../scenes";
import { withAppContext } from "../../providers";

const AppContent = withAppContext(() => (
  <View>
    <HomeEnvironment />
    <ZenEnvironment />
  </View>
));

export default AppContent;
// index.js
import React from "react";
import { AppRegistry, View } from "react-360";
import { AppContent } from "./components";
import { withAppContext } from "./providers";

const MeditationApp = withAppContext(() => (
  <View style={{ transform: [{ translate: [0, 0, -2] }] }}>
    <AppContent />
  </View>
));

AppRegistry.registerComponent("AppContent", () => AppContent);
AppRegistry.registerComponent("MeditationApp", () => MeditationApp);
My client.js deals with mounting my component to locations and surfaces:
// client.js
import { ReactInstance, Surface } from "react-360-web";

function init(bundle, parent, options = {}) {
  const r360 = new ReactInstance(bundle, parent, {
    fullScreen: true,
    // Add custom options here
    ...options,
  });

  r360.renderToSurface(
    r360.createRoot("AppContent", { /* initial props */ }),
    r360.getDefaultSurface()
  );

  r360.renderToLocation(
    r360.createRoot("MeditationApp", { /* initial props */ }),
    r360.getDefaultLocation()
  );

  r360.compositor.setBackground(
    r360.getAssetURL("images/homebase.png")
  );
}
window.React360 = {init};
Playing audio
In my consts folder, I created a zens.js file in which to store my data, including the correct audio file and image, for each environment:
const zens = [
  {
    id: 1,
    mantra: "Find your inner motherforking peace",
    image: "images/hawaii_beach.jpg",
    audio: "sounds/waves.mp3",
    text: "I'm feeling beachy keen",
  },
  {
    id: 2,
    mantra: "Breathe in peace, breathe out bullshirt",
    image: "images/horseshoe_bend.jpg",
    audio: "sounds/birds.mp3",
    text: "Ain't no mountain high enough",
  },
  {
    id: 3,
    mantra: "Benches will be benches",
    image: "images/sunrise_paris_2.jpg",
    audio: "sounds/chimes.mp3",
    text: "I want a baguette",
  },
  {
    id: 4,
    image: "images/homebase.png",
    text: "Home",
  },
];

export default zens;
To play audio in my React VR scenes, I used a Sound component, which took in a URL for a sound file in the static_assets folder as a source prop. To prevent audio from playing in environments where it didn't belong, such as the home environment, I implemented logic via Recompose for "hiding" and "showing" the Sound component based on whether we were in an environment with no audio files associated with it.
// React VR -- components/audio.js
import React from "react";
import { Sound, asset } from "react-vr";
import zens from "../consts/zens.js";
import { compose } from "recompose";
import { hideIf, usingAppContext } from "../providers/index.js";

const hideIfNoAudioUrl = hideIf(({ selectedZen }) => {
  const zenAudio = zens[selectedZen - 1].audio;
  return zenAudio === null || zenAudio === undefined || zenAudio.length === 0;
});

export default compose(
  usingAppContext,
  hideIfNoAudioUrl,
)(({ selectedZen }) => {
  const zenAudio = zens[selectedZen - 1].audio;
  return (
    <Sound source={asset(zenAudio)} />
  );
});
React 360 greatly improves upon this. To play audio, I use the AudioModule native module. Its playEnvironmental method lets me provide a path to the audio in the static_assets folder and a volume at which to play it, and the audio loops: once the file finishes playing, it starts again.
Along the way, I realized I needed to tell my application to stop playing a particular audio file when switching scenes. (Otherwise, while immersed in Find Your Zen, you might wind up listening to audio from your previous environment, such as church bells in a Paris city square, after navigating back to the home environment.) I accomplish this with the AudioModule's stopEnvironmental method.
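In isolation, the two AudioModule calls look roughly like this (following the React 360 docs; the file path is one of my app's assets, and the volume is an assumed value):
import { asset, NativeModules } from "react-360";

const { AudioModule } = NativeModules;

// start looping environmental audio from static_assets/sounds
AudioModule.playEnvironmental({
  source: asset("sounds/waves.mp3"),
  volume: 0.3, // assumed volume level
});

// later, when entering a scene with no audio of its own
AudioModule.stopEnvironmental();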
Keep reading to see this in action…
Using Images
In React VR, I used a Pano component to display a 360-degree photo. To display a specific image, Pano, like Audio, took in an asset URL as a source prop. Based on which environment the user selected, the app's state updated to display an image for that environment.
// React VR -- components/wrapped-pano.js
import React from "react";
import { Pano, asset } from "react-vr";
import { usingAppContext } from "../providers/index.js";
import { Audio } from "../components/index.js";
import zens from "../consts/zens.js";

export default usingAppContext(({ selectedZen }) => {
  return (
    <Pano source={asset(zens[selectedZen - 1].image)}>
      <Audio />
    </Pano>
  );
});
You may or may not have noticed that, in my React 360 application's client.js, I write the following line after rendering my application's components:
r360.compositor.setBackground(r360.getAssetURL("images/homebase.png"));
This line of code, which immediately sets the background image when the app is first mounted, uses the asset utility from React 360 to automatically look inside my static_assets folder for the correct image.
That's all well and good, but I still want to change the image based on which environment the user selects. Thankfully, I can handle dynamic images from within a React event by using React 360's Environment module. Here's some sample usage:
Environment.setBackgroundImage(asset(someImage));
To pull it all together, here's how I dynamically set my background image and audio based on which environment the user selects, using Recompose's withState and withHandlers functions:
// providers/withStateAndHandlers.js
import React from "react";
import { withState, withHandlers, compose } from "recompose";
import { Environment, asset, NativeModules } from "react-360";
import { zens } from "../consts";

const { AudioModule } = NativeModules;

const withStateAndHandlers = compose(
  // selectedZen defaults to 4, the home environment
  withState("selectedZen", "zenClicked", 4),
  withHandlers({
    zenClicked: (props) => (id, evt) => {
      Environment.setBackgroundImage(asset(zens[id - 1].image));
      if (zens[id - 1].audio !== null && zens[id - 1].audio !== undefined) {
        AudioModule.playEnvironmental({
          source: asset(zens[id - 1].audio),
          volume: 0.3, // assumed volume level
        });
      } else {
        // the home environment has no audio, so stop any audio
        // carried over from the previous scene
        AudioModule.stopEnvironmental();
      }
      // update selectedZen via the withState updater
      props.zenClicked(() => id);
    },
  }),
);

export default withStateAndHandlers;
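And here's a hypothetical sketch of a button component wired to that handler; the actual zen-button in the repo may differ in styling and structure:
// a hypothetical zen-button sketch -- the repo's real component may differ
import React from "react";
import { View, Text, VrButton } from "react-360";

const ZenButton = ({ id, text, zenClicked }) => (
  <VrButton onClick={() => zenClicked(id)}>
    <View>
      <Text>{text}</Text>
    </View>
  </VrButton>
);

export default ZenButton;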