I gather from your other post that you're talking about adding this to the iOS console.
I don't think it's really easy to do, but it all depends on your level of expertise in iOS and C development.
The iOS SDK does not support RTSP out of the box, so you need to find and compile a library to provide that service.
That part should still be OK. If you want to contribute code back to OR, however, you need to be careful about the licenses of the libraries you choose.
Juha is the expert on this, but anything Apache, MIT or LGPL should be OK.
But that is only the transport layer; you still need to decode and present the video.
I would assume the video is H.264, which is the most likely format.
Until now (that is, up to iOS 7.1), Apple did not expose access to the hardware decoder, which meant you had to do the decoding in software, using a library like ffmpeg.
I did look at that a very long time ago, but it turned out to be too slow for our application (it was on an iPhone 3G) and I never took it any further.
If you're happy with iOS 8-only support, then you can take advantage of the new API (Video Toolbox), which gives direct access to the hardware decoder.
There is a specific WWDC session on this topic; you can find the video on Apple's dev site.
If you get that part running, which is totally independent of OpenRemote, the rest is pretty simple to do.
It will need some thought on how to integrate within OR, but we can help there, and there should not be any technical difficulties.
I would thus advise, if you want to add this, starting with a simple standalone iOS app that displays the video from your camera.
That way you avoid having to look at the OR code and deal with any issues that might arise there.
Once you're there, then we can see how to integrate.
Hope this gives you a better idea on the way forward.