# expo-gl

`expo-gl` provides a `View` that acts as an OpenGL ES render target, useful for rendering 2D and 3D graphics. On mounting, an OpenGL ES context is created. Its drawing buffer is presented as the contents of the `View` every frame.

**Platform compatibility:** Android Device, Android Emulator, iOS Device, iOS Simulator, Web.
## Installation

```sh
expo install expo-gl
```

If you're installing this in a bare React Native app, you should also follow these additional installation instructions.
## Usage

```jsx
import React from 'react';
import { View } from 'react-native';
import { GLView } from 'expo-gl';

export default function App() {
  return (
    <View style={{ flex: 1, justifyContent: 'center', alignItems: 'center' }}>
      <GLView style={{ width: 300, height: 300 }} onContextCreate={onContextCreate} />
    </View>
  );
}

function onContextCreate(gl) {
  gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
  gl.clearColor(0, 1, 1, 1);

  // Create vertex shader (shape & position)
  const vert = gl.createShader(gl.VERTEX_SHADER);
  gl.shaderSource(
    vert,
    `
    void main(void) {
      gl_Position = vec4(0.0, 0.0, 0.0, 1.0);
      gl_PointSize = 150.0;
    }
  `
  );
  gl.compileShader(vert);

  // Create fragment shader (color)
  const frag = gl.createShader(gl.FRAGMENT_SHADER);
  gl.shaderSource(
    frag,
    `
    void main(void) {
      gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
    }
  `
  );
  gl.compileShader(frag);

  // Link together into a program
  const program = gl.createProgram();
  gl.attachShader(program, vert);
  gl.attachShader(program, frag);
  gl.linkProgram(program);
  gl.useProgram(program);

  gl.clear(gl.COLOR_BUFFER_BIT);
  gl.drawArrays(gl.POINTS, 0, 1);

  gl.flush();
  gl.endFrameEXP();
}
```
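The usage example above never checks whether its shaders actually compiled; a GLSL typo just produces a blank view. Below is a small sketch of a guard using the standard WebGL `COMPILE_STATUS` query (the helper name `compileShaderChecked` is mine, not part of the expo-gl API):

```javascript
// Sketch: compile a shader and throw with the driver's log on failure.
// Works against any WebGLRenderingContext-like object, including expo-gl's gl.
function compileShaderChecked(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    const log = gl.getShaderInfoLog(shader);
    gl.deleteShader(shader); // free the failed shader before reporting
    throw new Error(`Shader compile failed: ${log}`);
  }
  return shader;
}
```

Inside `onContextCreate` you would then write, for example, `const vert = compileShaderChecked(gl, gl.VERTEX_SHADER, vertSrc);` and see the driver's error message instead of a silent failure.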
## API

```js
import { GLView } from 'expo-gl';
```

### Props

In addition to the regular `View` props for layout and touch handling, the following props are available:

- `onContextCreate` — A function that will be called when the OpenGL ES context is created. It is passed a single argument `gl` that has a WebGLRenderingContext interface.
- `msaaSamples` — `GLView` can enable iOS's built-in multisampling. This prop specifies the number of samples to use. By default this is 4. Setting this to 0 turns off multisampling. On Android this prop is ignored.

### Methods

`GLView.destroyContextAsync(gl)` destroys the given GL context. Returns a promise that resolves to `true` if the given context existed and has been destroyed successfully.

`takeSnapshotAsync(options)` takes a snapshot of the framebuffer and saves it as a file in the app's cache directory. The following `options` are available:

- `rect` (`{ x: number, y: number, width: number, height: number }`) — Rect to crop the snapshot. It's passed directly to `glReadPixels`.
- `flip` (`boolean`) — Whether to flip the snapshot vertically. Defaults to `false`.
- `format` (`'jpeg'`, `'png'` or `'webp'` — Android only for the latter) — Specifies what type of compression should be used and what the result's file extension will be. PNG compression is lossless but slower; JPEG is faster, but the image has visible artifacts. Defaults to `'jpeg'`.
- `compress` (`number`) — A value from 0 to 1 specifying the quality of the resulting image, where 1 means maximum quality. Defaults to `1.0`.

> Note: When using the WebP format, the iOS version will print a warning and generate a `'png'` file instead. It is recommended to use platform-dependent code in this case. You can refer to the documentation on platform-specific code.

Returns `{ uri, localUri, width, height }`, where `uri` is a URI to the snapshot, `localUri` is a synonym for `uri` that makes this object compatible with `texImage2D`, and `width, height` specify the dimensions of the snapshot.
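Since `rect` is passed straight to `glReadPixels`, it is expressed in framebuffer pixels, so it's best computed from `gl.drawingBufferWidth` and `gl.drawingBufferHeight` rather than from layout points. A sketch of computing a centered square crop (the helper name is mine):

```javascript
// Sketch: centered square crop rect for takeSnapshotAsync, in buffer pixels.
function centeredSquareRect(drawingBufferWidth, drawingBufferHeight) {
  const side = Math.min(drawingBufferWidth, drawingBufferHeight);
  return {
    x: Math.floor((drawingBufferWidth - side) / 2),
    y: Math.floor((drawingBufferHeight - side) / 2),
    width: side,
    height: side,
  };
}
```

On device, the result would be passed as `takeSnapshotAsync({ rect: centeredSquareRect(gl.drawingBufferWidth, gl.drawingBufferHeight) })`.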
## High-level APIs

Since the WebGL API is quite low-level, it can be helpful to use higher-level graphics APIs rendering through a `GLView` underneath. The following libraries integrate popular graphics APIs: `expo-three` for three.js and `expo-processing` for processing.js.

Some GL-using libraries assume a browser context (such as access to `document`). Usually this is for resource loading or event handling, with the main rendering logic still only using pure WebGL, so these libraries can usually still be used with a couple of workarounds. The Expo-specific integrations above include workarounds for some popular libraries.

## WebGL API

The `gl` object received through the `onContextCreate` prop becomes the interface to the OpenGL ES context, providing a WebGL API. It resembles a WebGL2RenderingContext from the WebGL 2 spec. However, some older Android devices may not support WebGL 2 features. To check whether the device supports WebGL 2, it is recommended to use `gl instanceof WebGL2RenderingContext`.
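Code that wants WebGL 2-only features can branch on that check. A sketch with the constructor injectable so the logic can also be exercised off-device (the helper name is mine; on device, `WebGL2RenderingContext` is available as a global):

```javascript
// Sketch: report whether a context supports the WebGL 2 API.
function isWebGL2(gl, WebGL2Ctor = globalThis.WebGL2RenderingContext) {
  return typeof WebGL2Ctor === 'function' && gl instanceof WebGL2Ctor;
}
```

A renderer could then, for example, use vertex array objects when `isWebGL2(gl)` is `true` and fall back to rebinding attributes per draw call otherwise.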
An additional method, `gl.endFrameEXP()`, is present; it notifies the context that the current frame is ready to be presented. This is similar to a 'swap buffers' API call on other OpenGL platforms.

The following WebGL2RenderingContext methods are not yet implemented:

- `getFramebufferAttachmentParameter()`
- `getRenderbufferParameter()`
- `compressedTexImage2D()`
- `compressedTexSubImage2D()`
- `getTexParameter()`
- `getUniform()`
- `getVertexAttrib()`
- `getVertexAttribOffset()`
- `getBufferSubData()`
- `getInternalformatParameter()`
- `renderbufferStorageMultisample()`
- `compressedTexImage3D()`
- `compressedTexSubImage3D()`
- `fenceSync()`
- `isSync()`
- `deleteSync()`
- `clientWaitSync()`
- `waitSync()`
- `getSyncParameter()`
- `getActiveUniformBlockParameter()`
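Code ported from the browser may call one of these missing methods and crash. One pragmatic workaround is to guard optional calls at runtime; a minimal sketch (the helper name is mine, not an expo-gl API):

```javascript
// Sketch: invoke a context method only if this GL implementation provides it.
function callIfImplemented(gl, method, ...args) {
  if (typeof gl[method] === 'function') {
    return gl[method](...args);
  }
  return undefined; // silently skip on contexts lacking the method
}
```

For example, `callIfImplemented(gl, 'getTexParameter', gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER)` returns the value where supported and `undefined` otherwise, instead of throwing.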
### Remarks

The `pixels` argument of `texImage2D()` must be `null`, an `ArrayBuffer` with pixel data, or an object of the form `{ localUri }` where `localUri` is the `file://` URI of an image in the device's file system. Thus an `Asset` object could be used once `.downloadAsync()` has been called on it (and completed) to fetch the resource.
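As a sketch of that rule: validate that an `Asset`-like object has actually been downloaded before handing it to `texImage2D` (the helper name is mine; on device you would pass a real `expo-asset` `Asset`):

```javascript
// Sketch: turn a downloaded Asset-like object into the { localUri } form
// accepted by texImage2D's pixels argument. localUri is only set after
// downloadAsync() has completed.
function texturePixelsFrom(asset) {
  if (!asset.localUri || !asset.localUri.startsWith('file://')) {
    throw new Error('call downloadAsync() first; localUri must be a file:// URI');
  }
  return { localUri: asset.localUri };
}
```

After `await asset.downloadAsync()`, the result could then be passed as the `pixels` argument, e.g. `gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, texturePixelsFrom(asset))`.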
## Using expo-gl with worklets

```tsx
import React from 'react';
import { View } from 'react-native';
import { runOnUI } from 'react-native-reanimated';
import { GLView } from 'expo-gl';

function render(gl) {
  'worklet';
  // add your WebGL code here
}

function onContextCreate(gl) {
  runOnUI((contextId: number) => {
    'worklet';
    const gl = GLView.getWorkletContext(contextId);
    render(gl);
  })(gl.contextId);
}

export default function App() {
  return (
    <View style={{ flex: 1, justifyContent: 'center', alignItems: 'center' }}>
      <GLView style={{ width: 300, height: 300 }} onContextCreate={onContextCreate} />
    </View>
  );
}
```
For a more in-depth example of how to use `expo-gl` with Reanimated and Gesture Handler, you can check this example.

Everything passed to `runOnUI` executes on the UI thread, so every function used there needs to be a worklet, with `'worklet'` added at the start.

If you need to pass an asset to the worklet, with `expo-asset` you can just pass the asset object returned by `Asset.fromModule` or by the `useAssets` hook to the `runOnUI` function.

Inside worklets you can use `requestAnimationFrame` to schedule rendering, but APIs like `setTimeout` are not supported.
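A worklet render loop can therefore be driven with `requestAnimationFrame`. A sketch (the `startLoop` helper is mine, with the scheduler injectable so the loop shape can be exercised off-device):

```javascript
// Sketch: drive per-frame rendering inside a worklet with
// requestAnimationFrame, since setTimeout is not available there.
function startLoop(gl, raf = globalThis.requestAnimationFrame) {
  'worklet';
  function frame() {
    // ...issue WebGL draw calls on gl here...
    gl.endFrameEXP(); // present the finished frame
    raf(frame); // schedule the next frame
  }
  raf(frame);
}
```

In the worklet example above, `render(gl)` could call `startLoop(gl)` to keep drawing continuously instead of rendering a single frame.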