Post comments and corrections to http://lists.theta360.guide in a public category or direct message @jcasman or @craig on the lists system. You must be logged in to send a message.
This is an unofficial, community-generated guide for developing applications that use 360 media from RICOH THETA cameras. This is not authorized by RICOH and is based on publicly available information.
Get the latest news and updates on Twitter @theta360dev
1. Resources
1.1. Official Information
- RICOH THETA API v2, compliant with Open Spherical Camera API version 1.0 from Google.
1.2. Unofficial Information
- GitHub listing of community repositories - libraries and code examples from the community
- Blog for developers - how-to articles from the community
- Guides
  - Unofficial API Guide - Use the RICOH THETA S API, an HTTP API based on Open Spherical Camera (OSC) specifications. Beginners should start here. Covers web browsers using advanced HTTP clients, command-line curl, and Python. Examples for JavaScript and other languages are planned.
  - Unofficial Media Guide - 360 degree media management: streaming video, equirectangular conversion, stitching
- RICOH THETA Developers SF Bay Area Meetup Group - Focused on developers and companies using the RICOH THETA camera
2. 360 Video Formats
Video is stored on the camera in dual-fisheye mode. The dual-fisheye MP4 file has an image size of 1920x1080 or 1280x720.
The official RICOH desktop app will convert this to equirectangular mode. By default, the program takes your file name and adds _er to it. The file is saved in the same directory as the original video file. Both the input and output are .mp4 format videos.
Note that there is a minor color shading difference on the right side of the screen grab. The camera has two lenses, and each lens captures and saves one sphere. The software on the computer stitches the image together, and the RICOH app adds the 360 video metadata for you.
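The `_er` naming convention is handy to know when scripting batch conversions. A minimal Python sketch (the helper name is my own, not part of any RICOH tool):

```python
from pathlib import PurePosixPath

def converted_name(src):
    """Return the path the RICOH desktop app uses for equirectangular
    output: same directory, original name plus the _er suffix."""
    p = PurePosixPath(src)
    return str(p.with_name(p.stem + "_er" + p.suffix))

print(converted_name("/videos/R0010123.MP4"))  # /videos/R0010123_er.MP4
```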
If you use software such as Adobe Premiere Pro to edit your videos, you will need to inject the metadata again.
You can now upload the video to YouTube on your normal channel. YouTube will read the metadata and add the 360 degree video controls automatically. This is cool!
I’ve successfully tested the YouTube viewer on Windows and Mac with different browsers, and on Linux with Firefox. It does not appear to work on Linux with Chrome.
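If you later need to re-inject the spherical metadata after editing, many in the community use Google's Spatial Media Metadata Injector. A hedged sketch that only builds the command line, assuming the spatialmedia package is installed and runnable with `-m` (run it yourself with subprocess):

```python
def build_inject_command(src, dst):
    """Build the command line for Google's Spatial Media Metadata
    Injector (assumed installed); -i injects the spherical XMP tags
    into a new copy of the video."""
    return ["python", "-m", "spatialmedia", "-i", src, dst]

# Run with subprocess.run(build_inject_command(...), check=True)
print(" ".join(build_inject_command("edited.mp4", "edited_360.mp4")))
# python -m spatialmedia -i edited.mp4 edited_360.mp4
```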
3. Video Live Streaming
The THETA S can stream video when connected over a USB or HDMI cable. The output is in dual-fisheye mode (two spheres). To make the video usable in a web browser or with external services, you’ll need to convert the video into equirectangular mode.
To view a live streaming video with 360 degree navigation, you can connect it to a YouTube live streaming event.
YouTube live streaming now needs to be set up as an Event, not a Stream. You must check "This live stream is 360" under the Advanced Settings tab.
Follow this guide for RICOH THETA 360 degree live streaming on YouTube.
Type | Format | Camera Mode | Size | Frame Rate | Connection |
---|---|---|---|---|---|
Live View | Equirectangular in MotionJPEG | Image only | 640x320 | 10fps | WiFi |
USB live streaming of dual-fisheye | Dual-fisheye in MotionJPEG | Live streaming | 1280x720 | 15fps | USB isochronous transfer |
HDMI live streaming of dual-fisheye | Dual-fisheye in uncompressed YCbCr | Live streaming | 1920x1080, 1280x720, 720x480 | 30fps | HDMI |
USB live streaming of equirectangular | Equirectangular in MotionJPEG | Live streaming | 1280x720 | 15fps | USB |
There are pros and cons of each method.
Live Stream Type | Pro | Con | Community Implementation |
---|---|---|---|
USB equirectangular | Easy; appears as a webcam to your application | Low resolution (720p); slow frame rate (15fps) | Looking forward to seeing this after release! Hope they output to HDMI |
HDMI conversion to equirectangular | Highest resolution and frame rate; the most common input for broadcast equipment | Difficult; no packaged solution; people are building their own | Most people use Unity and then connect to third-party equipment for live event broadcasting |
USB dual-fisheye | Easy; appears as a webcam; higher resolution than equirectangular; less load on the CPU | Output is dual-fisheye, so you will need to build an application to process it | Experimenting with facial and object recognition, surveillance |
WiFi Live View | Works over WiFi; equirectangular | Low resolution; slow; designed for preview, not live streaming | Previewing scenes |
Unless you are using camera._getLivePreview in image mode to display a low-resolution live view with a low frame rate, the first step is to get the camera into live streaming mode.
3.1. For USB
- Hold down the mode button and press the power button → the camera enters Live Video Streaming mode.
- Connect the THETA S to a laptop (Mac or PC) with a USB cable.
- The THETA S can now be used as a webcam. You can use webcam software such as Skype to see live video streaming from the THETA S.
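Once the camera shows up as a webcam, each captured frame contains the two fisheye circles side by side. A small pure-Python sketch (the helper name is hypothetical) of the crop rectangles for the two lenses, assuming the circles split a 1280x720 frame exactly in half:

```python
def fisheye_crop_boxes(width, height):
    """Return (x, y, w, h) crop rectangles for the left and right
    fisheye circles in a side-by-side dual-fisheye frame.
    Assumes the two circles split the frame exactly in half."""
    half = width // 2
    return (0, 0, half, height), (half, 0, half, height)

left, right = fisheye_crop_boxes(1280, 720)
print(left, right)  # (0, 0, 640, 720) (640, 0, 640, 720)
```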
3.2. For HDMI
- Hold down the mode button and press the power button → the camera enters Live Video Streaming mode.
- Connect the THETA S to a monitor with an HDMI cable.
- The THETA S acts as an output video device. The monitor shows the camera's live video stream.
3.3. API Testing
{"name": "camera.getOptions", "parameters": { "sessionId": "SID_0003", "optionNames": [ "captureMode" ] } }
Remember to set your sessionId correctly.
The response:
{ "name": "camera.getOptions", "state": "done", "results":{ "options":{ "iso": 0, "isoSupport":[], "captureMode": "_liveStreaming" } } }
3.4. RICOH Live Streaming Driver (THETA UVC Blender) with Equirectangular Output
The USB driver appears as a webcam to applications running on a Mac or PC. In the example above, the equirectangular video is shown streaming in QuickTime. A video clip that shows the output of the THETA S USB live streaming is available on YouTube.
You must install the live streaming driver, THETA UVC Blender. Download it from the official RICOH THETA site.
This walkthrough is for Windows 10 64 bit.
Right-click the UVCBlender_setup64_en.exe icon and select Run as administrator.
Run through the install wizard.
After installation, you will need to connect a THETA that is powered off to register the device. You may need to reboot. In my tests, I could not advance to the UVC register step without a reboot. If I try and register the device without rebooting, I see a flashing Establishing a Connection dialog, but the connection is never completed.
If you need to reboot, run the THETA UVC Register application as administrator.
After a reboot and starting THETA UVC Register as administrator, I then plug the THETA in. The THETA is turned off. The connection is established.
The registration completes.
At this stage, you may need to reboot again. If you do, it is a one-time requirement.
Now, press power and mode to start the THETA S in live streaming mode. Test it with a common video streaming application such as Google Hangout. I have heard of people having problems displaying the stream. Start your application first with the camera unplugged. Once the application is running, select THETA UVC Blender, then start your THETA in live streaming mode, and plug it in. There are probably other ways to get the application to recognize the live stream, but this sequence consistently works for me.
Go into the settings of Google Hangout to select the THETA UVC Blender webcam.
Here’s an example with Open Broadcaster Software.
Create a new Scene called THETA Test. Right-click in Sources to add a new Video Capture Device.
Under the Device Selection window, select THETA UVC Blender. Click OK.
The video stream will fill a portion of the screen. Select Edit Scene to size the video stream to fit.
You can use OBS or other software to stream the image to YouTube. Refer to the blog post below for more information:
3.5. Example with Processing Language
Community Contribution from Sunao Hashimoto, kougaku on GitHub. Full sample source code is available.
The example above is built with Processing.
Additional information is on his blog post in Japanese.
3.6. Examples with 3D Tools such as Unity and Maya LT
Nora, @Stereoarts, released a shader pack to convert THETA 360 degree media into equirectangular format in real time.
The developer below, GOROman, was able to get reasonable 360 video live streaming in equirectangular mode after an hour of work back in September 2015. Additional information in Japanese is here.
The section below was translated from Japanese by Jesse Casman.
hecomi recently wrote an interesting blog post using Unity to view realtime 360 degree video streaming. I personally have very little experience with Unity, but the content and pictures were useful, so I translated the blog post for others to use. This is not an exact translation, but it should be much clearer than a Google Translate output.
I noticed GOROman ([@GOROman](https://twitter.com/GOROman)) tweeting about the info below, so I decided to try it myself right then and there.
@GOROman tweet: You should be able to easily build a live VR stream from this. Stitching might be an issue… For the time being, it might be fine to just connect the sphere to the UV texture.
The THETA S…includes features like dual-fisheye streaming over USB (1280x720, 15fps) and HDMI streaming (1920x1080, 30fps). In order to view this using Unity, I made an appropriately UV-developed sphere and a shader that alpha-blends the border. Ultimately, for the purpose of making a full sphere with the THETA S, it would be much higher quality and more convenient (you can use Skybox too!) to use the fragment shader made by Nora (@Stereoarts), which writes equirectangular output directly onto a plane.
@Stereoarts tweet: I’ve released a Theta Shader Pack. A shader for converting THETA / THETA S full sphere video to Equirectangular in Unity and supporting scripts. stereoarts.jp/ThetaShaderPack_20150926.zip
For this article, I wanted to jot down my techniques as well.
Sample
Example of taking a video with THETA
The THETA S gives beautifully separated spheres. The angle covered in one sphere is slightly larger than 180 degrees.
For this adjustment, I used a sample texture that GOROman captured with WebCamTexture.
Making a sphere with good UV setting
Working with Maya LT, if you make a normal sphere, the UV comes out like this.
If you make a plane with that UV, it looks like the image below.
All it needs is to be cut in half and moved appropriately.
It looks like this. (I did not adjust it, so it might be slightly off.)
Actually, I wanted to alpha-blend the border, so I used two overlapping half spheres instead of one sphere. The UV border is stretched manually as needed.
Incidentally, the surface is set to face inward by reversing all the normal vectors. The UV position and size are fine to adjust later with the shader.
Setting with Unity
Import the model built in Maya LT into Unity, and put the camera in the center. Write a shader so the model's UV position can be adjusted and alpha blending applied. To control the drawing order and prevent the border from changing at certain orientations, each half sphere uses a different shader.
Shader "Theta/Sphere1" { Properties { _MainTex ("Base (RGB)", 2D) = "white" {} _AlphaBlendTex ("Alpha Blend (RGBA)", 2D) = "white" {} _OffsetU ("Offset U", Range(-0.5, 0.5)) = 0 _OffsetV ("Offset V", Range(-0.5, 0.5)) = 0 _ScaleU ("Scale U", Range(0.8, 1.2)) = 1 _ScaleV ("Scale V", Range(0.8, 1.2)) = 1 _ScaleCenterU ("Scale Center U", Range(0.0, 1.0)) = 0 _ScaleCenterV ("Scale Center V", Range(0.0, 1.0)) = 0 } SubShader { Tags { "RenderType" = "Transparent" "Queue" = "Background" } Pass { Name "BASE"
Blend SrcAlpha OneMinusSrcAlpha Lighting Off ZWrite Off
CGPROGRAM #pragma vertex vert_img #pragma fragment frag
#include "UnityCG.cginc"
uniform sampler2D _MainTex; uniform sampler2D _AlphaBlendTex; uniform float _OffsetU; uniform float _OffsetV; uniform float _ScaleU; uniform float _ScaleV; uniform float _ScaleCenterU; uniform float _ScaleCenterV;
float4 frag(v2f_img i) : COLOR { // 中心位置や大きさを微調整 float2 uvCenter = float2(_ScaleCenterU, _ScaleCenterV); float2 uvOffset = float2(_OffsetU, _OffsetV); float2 uvScale = float2(_ScaleU, _ScaleV); float2 uv = (i.uv - uvCenter) * uvScale + uvCenter + uvOffset; // アルファブレンド用のテクスチャを参照してアルファを調整 float4 tex = tex2D(_MainTex, uv); tex.a *= pow(1.0 - tex2D(_AlphaBlendTex, i.uv).a, 2); return tex; } ENDCG } } }
Here’s a second section of code.
Shader "Theta/Sphere2" { Properties { _MainTex ("Base (RGB)", 2D) = "white" {} _AlphaBlendTex ("Alpha Blend (RGBA)", 2D) = "white" {} _OffsetU ("Offset U", Range(-0.5, 0.5)) = 0 _OffsetV ("Offset V", Range(-0.5, 0.5)) = 0 _ScaleU ("Scale U", Range(0.8, 1.2)) = 1 _ScaleV ("Scale V", Range(0.8, 1.2)) = 1 _ScaleCenterU ("Scale Center U", Range(0.0, 1.0)) = 0 _ScaleCenterV ("Scale Center V", Range(0.0, 1.0)) = 0 } SubShader { Tags { "RenderType" = "Transparent" "Queue" = "Background+1" } UsePass "Theta/Sphere1/BASE" } }
For the alpha blend, make a texture whose alpha is adjusted to match the UV layout, as below. I got a perfect fit by exporting the UV as PostScript and editing it in Illustrator (the white circle inside has alpha = 1; around the circle, alpha fades from 1 to 0 from inside to outside; the area outside the circle is unused, so anything is fine there).
Then, adjust the parameters and you’ve got a whole sphere.
Changing into Equirectangular
I tried it with a modified vertex shader.
Shader "Theta/Equirectangular1" { Properties { _MainTex ("Base (RGB)", 2D) = "white" {} _AlphaBlendTex ("Alpha Blend (RGBA)", 2D) = "white" {} _OffsetU ("Offset U", Range(-0.5, 0.5)) = 0 _OffsetV ("Offset V", Range(-0.5, 0.5)) = 0 _ScaleU ("Scale U", Range(0.8, 1.2)) = 1 _ScaleV ("Scale V", Range(0.8, 1.2)) = 1 _ScaleCenterU ("Scale Center U", Range(0.0, 1.0)) = 0 _ScaleCenterV ("Scale Center V", Range(0.0, 1.0)) = 0 _Aspect ("Aspect", Float) = 1.777777777 } SubShader { Tags { "RenderType" = "Transparent" "Queue" = "Background" } Pass { Name "BASE"
Blend SrcAlpha OneMinusSrcAlpha Lighting Off ZWrite Off
CGPROGRAM #pragma vertex vert #pragma fragment frag #define PI 3.1415925358979
#include "UnityCG.cginc"
uniform sampler2D _MainTex; uniform sampler2D _AlphaBlendTex; uniform float _OffsetU; uniform float _OffsetV; uniform float _ScaleU; uniform float _ScaleV; uniform float _ScaleCenterU; uniform float _ScaleCenterV; uniform float _Aspect;
struct v2f { float4 position : SV_POSITION; float2 uv : TEXCOORD0; };
v2f vert(appdata_base v) { float4 modelBase = mul(_Object2World, float4(0, 0, 0, 1)); float4 modelVert = mul(_Object2World, v.vertex);
float x = modelVert.x; float y = modelVert.y; float z = modelVert.z;
float r = sqrt(x*x + y*y + z*z); x /= 2 * r; y /= 2 * r; z /= 2 * r;
float latitude = atan2(0.5, -y); float longitude = atan2(x, z);
float ex = longitude / (2 * PI); float ey = (latitude - PI / 2) / PI * 2; float ez = 0;
ex *= _Aspect;
modelVert = float4(float3(ex, ey, ez) * 2 * r, 1);
v2f o; o.position = mul(UNITY_MATRIX_VP, modelVert); o.uv = MultiplyUV(UNITY_MATRIX_TEXTURE0, v.texcoord); return o; }
float4 frag(v2f i) : COLOR { float2 uvCenter = float2(_ScaleCenterU, _ScaleCenterV); float2 uvOffset = float2(_OffsetU, _OffsetV); float2 uvScale = float2(_ScaleU, _ScaleV); float2 uv = (i.uv - uvCenter) * uvScale + uvCenter + uvOffset; float4 tex = tex2D(_MainTex, uv); tex.a *= pow(1.0 - tex2D(_AlphaBlendTex, i.uv).a, 2); return tex; } ENDCG } } }
Here’s a second section of code.
Shader "Theta/Equirectangular2" { Properties { _MainTex ("Base (RGB)", 2D) = "white" {} _AlphaBlendTex ("Alpha Blend (RGBA)", 2D) = "white" {} _OffsetU ("Offset U", Range(-0.5, 0.5)) = 0 _OffsetV ("Offset V", Range(-0.5, 0.5)) = 0 _ScaleU ("Scale U", Range(0.8, 1.2)) = 1 _ScaleV ("Scale V", Range(0.8, 1.2)) = 1 _ScaleCenterU ("Scale Center U", Range(0.0, 1.0)) = 0 _ScaleCenterV ("Scale Center V", Range(0.0, 1.0)) = 0 _Aspect ("Aspect", Float) = 1.777777777 } SubShader { Tags { "RenderType" = "Transparent" "Queue" = "Background+1" } UsePass "Theta/Equirectangular1/BASE" } }
Looking at the mesh, it moves around like this.
Because the polygons do not fit exactly, there is a blank space in the corner. This could have been avoided by using a direct fragment shader like Nora's.
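For reference, the standard direction-to-equirectangular mapping that shaders like these build on can be sketched in Python (my formulation, not the exact math in hecomi's vertex shader):

```python
import math

def dir_to_equirect(x, y, z):
    """Map a 3D view direction to (u, v) in [0, 1] on an equirectangular
    image. Standard spherical mapping; the vertex shader above uses its
    own variant of the same idea."""
    r = math.sqrt(x * x + y * y + z * z)
    longitude = math.atan2(x, z)       # -pi..pi around the vertical axis
    latitude = math.asin(y / r)        # -pi/2..pi/2 up from the horizon
    u = 0.5 + longitude / (2 * math.pi)
    v = 0.5 - latitude / math.pi
    return u, v

print(dir_to_equirect(0.0, 0.0, 1.0))  # (0.5, 0.5): straight ahead maps to image center
```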
Conclusion
It looks like there’s the possibility of multiple fun topics here like spherical AR and Stabilization. After the THETA S goes on sale, I would love to play with it more.
Update March 9, 2016
Maya Planar UV Mapping Instruction
Thanks to @hecomi for providing this video that shows how to tweak the UV mapping. He produced it to help Flex van Geuns who was trying to use the UV mapping with webgl.
3.7. Community Articles About 360 Display in Unity
4. Image Viewers
The RICOH THETA S will generate an equirectangular JPEG file of 5376x2688 or 2048x1024.
4.1. Example in Processing language
4.2. Example in Javascript
At the hackathon, we used the open source viewer from Akokubo listed above as part of our real estate demo built by a high school student.
There are other open source JavaScript viewers that we have not tested:
5. Graphic Tech
5.1. HDR
Simple HDR is a great application that works with the RICOH THETA S and m15 models. It is developed by Built Light.
Nick Campbell produced a video that gives a good overview of Simple HDR and 360 HDRIs.
Another HDR application is HDR 360 Bracket Pro for RICOH THETA by Brad Herman, the CTO and co-founder of SPACES.
6. Linux
RICOH only supports Mac and Windows desktops. As many developers use Linux, we’ve collected some information from the community to help people with basic tasks.
Linux can be used to control the camera HTTP API. There are also a number of scripts to get media from the camera.
Tips from the community:
- YouTube 360 videos work with Firefox on Linux. Some people have had problems with Chrome.
- If you’re running Linux in VirtualBox as a guest, turn off 3D hardware acceleration.
- There are a large number of viewers at the Panotools.org site. I’ve been using FSPViewer.
If you want to use Linux to download media from the THETA and view it on your Linux box, you can use Wine for image viewing using the THETA Windows app or use a third-party application, Pano2VR.
Documentation below contributed by Svendus
SphericalViewer.exe installs and opens with Wine. It runs and you can view spherical images, but videos are not converted.
Linux users can also import the files and use Pano2VR5.
- sudo apt-get install --no-install-recommends gnome-panel
- sudo gnome-desktop-item-edit /usr/share/applications/ --create-new
7. Proprietary Technical Information
7.1. Lens Parameter Information
The lens geometry for the THETA is based on equidistant projection. The final projection style for both videos and images is equirectangular projection. RICOH does not make detailed lens parameter information available. This is also known as lens distortion data. Developers often ask for this information to improve stitching. It is proprietary and not available as of December 2015. Stitching is still possible without this information.
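Equidistant projection means the distance of a pixel from the image center is proportional to the angle from the optical axis (r = f·θ). A sketch of inverting that model; the half field of view used here is an illustrative assumption (each lens covers slightly more than 180 degrees), not RICOH's actual calibration:

```python
import math

def pixel_radius_to_angle(r_pixels, r_max_pixels, half_fov_deg=95.0):
    """Invert the equidistant model r = f * theta.
    half_fov_deg is an assumption: each THETA lens covers slightly more
    than 180 degrees, so roughly 95 degrees out from the optical axis."""
    f = r_max_pixels / math.radians(half_fov_deg)  # pixels per radian
    return r_pixels / f                            # angle in radians

# Halfway out from the center maps to half of the assumed half-FOV
theta = pixel_radius_to_angle(320, 640)
print(math.degrees(theta))
```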
7.2. Stitching Libraries
The RICOH THETA S processes still images inside of the camera. It takes 8 seconds for the camera to be ready to take another still image.
The videos are stored in dual-fisheye format (two spheres). The official RICOH applications will convert this into equirectangular format on either mobile devices or desktop computers. This format can then be viewed locally or uploaded to YouTube, Facebook, or other sites.
The source code and algorithms to perform this stitching are not available to developers.
As of December 2015, there is no way to use the RICOH libraries in live streaming mode.
8. FAQ for Media Editing
8.1. Q: How do I edit video in Adobe Premiere Pro?
A:
- Download the dual-fisheye media file to your computer.
- Convert it with the official RICOH THETA desktop application. The file name will end in _er.
- Edit in Premiere Pro. Change the audio track or add special effects with After Effects.
- On your desktop computer, inject the metadata again using another tool.
- Upload to YouTube or another 360 degree video player or site.
8.2. Q: How do I edit a still image in Photoshop or other image editing software?
Example of use: The photographer wants to lighten an image.
A: DennisMabrey wrote a guide on how to add metadata for the image back into the still image.
Here is the 'workflow':
- Edit your JPEG in Photoshop.
- Export as JPEG to a NEW FILE. Do not overwrite the original (back it up or something).
- Run ExifToolGUI.
- Select the exported JPEG file.
- Select the menu option Modify/Remove Metadata.
- Select the top option '-remove ALL metadata' and click the 'Execute' button.
- Make sure your exported JPEG file is still selected.
- Select the menu option 'Export/Import'/Copy metadata into JPG or TIFF.
- In the file dialog, select the ORIGINAL panorama JPG file.
- Make sure ALL options are selected and click 'Execute'.
- If you look at the metadata tab with the ALL button clicked, you should see both a section labeled 'Ricoh' AND one labeled 'XMP-GPano' (Google's XMP pano).
- Test the exported JPG in the RICOH program. Hopefully it worked.
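The same workflow can also be done from the command line with exiftool instead of ExifToolGUI. A hedged sketch that only builds the two commands (run them yourself with subprocess or a shell):

```python
def exiftool_commands(original, edited):
    """Command-line equivalent (an assumption, not from the guide above)
    of the ExifToolGUI steps: strip all metadata from the edited export,
    then copy every tag back from the original panorama."""
    strip = ["exiftool", "-all=", edited]
    copy = ["exiftool", "-tagsFromFile", original, "-all:all", edited]
    return strip, copy

for cmd in exiftool_commands("R0010123.JPG", "R0010123_edit.JPG"):
    print(" ".join(cmd))
```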
8.3. Q: How do I connect the camera as a USB storage device?
A: Hold the WiFi and Shutter buttons on your camera while you plug the camera into the USB port of your computer. The camera will appear as RICODCX. This is generally more of a problem on Macs. Make sure you turn off auto-import into Photos. People have experienced problems with importing the 360 images into Photos. Save them to disk and use the RICOH app.
8.4. Q: What are the technical specifications of images and video?
A: The official RICOH site has great information in the overview section.