chadwpalm
That's a really interesting idea, actually. I think we could get to a point where that's a reality relatively soon. It would probably start out as something proprietary, like a Yaw camera or camera module made just for a Yaw program, and eventually it could lead to a universal system of cameras that capture motion data and translate it into a format other programs can interpret.
You don't need a rig. Accelerometer chips are about a square centimeter and can detect motion in all three axes; they're in your phone. I suppose a motion-control camera rig could do this today, but it wouldn't be something you could just take on a ride with you.
You could just embed one in a digital camera and simultaneously record the raw data from the accelerometer to a file with the same timestamp as the video. Then the raw data can be converted for use with whatever format the Yaw VR understands. One of my labs in my embedded systems course did something similar, though we were just outputting the accelerometer data to the computer screen. You don't even need to have it in the camera. I could probably build a small box in under a week that does this with just a microcontroller, an accelerometer, and a microSD card slot, and it only needs 3.3 V to run. Something like the sketch below is what I mean.
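For anyone curious what that box would look like in code, here's a rough Arduino-style sketch of the logging side. I'm assuming an MPU-6050 accelerometer over I2C and an SD breakout with chip select on pin 10; swap those details for whatever parts you actually use.

// Rough sketch: log raw accelerometer readings to microSD with a millisecond
// timestamp. Assumes an MPU-6050 on I2C (address 0x68) and an SD card breakout
// on SPI with chip select on pin 10 -- adjust for your hardware.
#include <Wire.h>
#include <SD.h>

const int MPU_ADDR = 0x68;   // MPU-6050 I2C address (AD0 tied low)
const int SD_CS    = 10;     // SD card chip-select pin (board dependent)
File logFile;

void setup() {
  Wire.begin();
  // Wake the MPU-6050 out of sleep mode (PWR_MGMT_1 register = 0)
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);
  Wire.write(0);
  Wire.endTransmission(true);

  SD.begin(SD_CS);
  logFile = SD.open("accel.csv", FILE_WRITE);
  logFile.println("ms,ax,ay,az");   // raw 16-bit counts, converted later on the PC
}

void loop() {
  // Read 6 bytes starting at ACCEL_XOUT_H (0x3B): X, Y, Z high/low byte pairs
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 6, true);
  int16_t ax = (Wire.read() << 8) | Wire.read();
  int16_t ay = (Wire.read() << 8) | Wire.read();
  int16_t az = (Wire.read() << 8) | Wire.read();

  // Timestamp in milliseconds since boot; sync the start with the camera
  // (e.g., a sharp bump right when recording starts) to line the file up
  // with the video afterward.
  logFile.print(millis()); logFile.print(',');
  logFile.print(ax); logFile.print(',');
  logFile.print(ay); logFile.print(',');
  logFile.println(az);
  logFile.flush();      // don't lose samples if power drops mid-ride

  delay(10);            // roughly 100 Hz sample rate
}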
If I lived near DL and had a 360 VR cam I would probably try it. It wouldn't be hard to write a simple C++ program to translate the data. Not that Yaw VRs are common enough for there to be much demand for it, but it wouldn't be hard.
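The translation side really would be short. Something like this; I don't know the Yaw VR's actual input format, so this just turns the raw counts from the log into g-values and writes a generic timestamped CSV as a placeholder for whatever the software expects.

// Quick-and-dirty translator sketch (desktop C++). Reads the logged CSV of raw
// MPU-6050 counts and converts them to g-values. The output format is a stand-in,
// not the real Yaw VR format.
#include <fstream>
#include <sstream>
#include <string>
#include <iostream>

int main(int argc, char** argv) {
    if (argc < 3) {
        std::cerr << "usage: translate accel.csv out.csv\n";
        return 1;
    }
    std::ifstream in(argv[1]);
    std::ofstream out(argv[2]);
    const double lsbPerG = 16384.0;   // MPU-6050 at the default +/-2 g range

    std::string line;
    std::getline(in, line);           // skip the "ms,ax,ay,az" header
    out << "time_s,ax_g,ay_g,az_g\n";

    while (std::getline(in, line)) {
        std::stringstream ss(line);
        std::string field;
        long ms;
        long raw[3];

        std::getline(ss, field, ','); ms = std::stol(field);
        for (int i = 0; i < 3; ++i) {
            std::getline(ss, field, ',');
            raw[i] = std::stol(field);
        }
        out << ms / 1000.0 << ','
            << raw[0] / lsbPerG << ','
            << raw[1] / lsbPerG << ','
            << raw[2] / lsbPerG << '\n';
    }
    return 0;
}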