> The project is a VR project. How are you gonna do VR in a ray-tracer at 90 frames a second?
Ok. So you need it to run in real time and capture the result into 360 stereo video? You should have mentioned that in the first post. In that case the pixel-columns approach probably won't be fast enough, since your rendering shaders would need to process the whole scene's vertex data once per column, every frame.
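To get a feel for how badly that multiplies the load, here's a rough back-of-envelope sketch. All the numbers (output width, vertex count, frame rate) are made-up assumptions for illustration, not figures from your project:

```python
# Rough cost estimate for the per-column approach: the scene's vertices
# go through the vertex stage once per rendered column, per eye.
# All numbers below are illustrative assumptions.

output_width = 4096        # columns in a 4K-wide 360 video frame
scene_vertices = 100_000   # a fairly light scene
eyes = 2                   # stereo
fps = 90                   # target frame rate

per_frame = output_width * scene_vertices * eyes
per_second = per_frame * fps

print(f"{per_frame:,} vertex-stage invocations per frame")
print(f"{per_second:,} vertex-stage invocations per second")
```

For comparison, rendering the same scene once per eye is only 200,000 vertex-stage runs per frame; the column approach multiplies that by the output width, which is why I expect the GPU to choke first.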
The likely bottleneck here is on the GPU side, so doing it in native code wouldn't make much difference. The whole feature may not be feasible in realtime at all, except perhaps for very light (vertex-wise) scenes. But I could be wrong about that estimate: go implement it using viewports, shaders, and GDScript, then profile it and see where the bottlenecks actually are. If they turn out to be on the CPU side, you can think about porting what you have to native code. If they are on the GPU side, there's not much you can do other than try a different approach.
Do you know if anybody else has managed to implement this, either in a project or in a commercial product?