Hey! I just shipped Farpoint on PSVR as the VFX artist, so I have some experience with this.
Yeah, so sprites work well until they're about 5-10 meters from the camera. You definitely need to use camera fade to alpha them out as they get closer than that.
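The camera fade idea is just a distance-based alpha multiplier. A minimal sketch of that math (the distance values here are illustrative assumptions, not Farpoint's actual settings):

```cpp
#include <algorithm>
#include <cassert>

// Fade a sprite out as it approaches the camera: fully invisible at
// fadeEnd, fully visible at fadeStart. Multiply the sprite's alpha by
// the returned value. Distances are hypothetical, in meters.
float CameraDistanceFade(float distToCamera,
                         float fadeEnd   = 0.5f,  // invisible at/inside this
                         float fadeStart = 2.0f)  // fully visible past this
{
    float t = (distToCamera - fadeEnd) / (fadeStart - fadeEnd);
    return std::clamp(t, 0.0f, 1.0f);
}
```

In Unreal you'd normally get this from the material side rather than C++, but the falloff it produces is the same.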
Always use soft particles for sprites when they clip into surfaces, otherwise you will see the hard line of the sprite intersection.
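Soft particles work by comparing the sprite's depth against the scene depth behind it and fading the alpha as the two converge, which hides that hard intersection line. A small sketch of the fade term (fade distance is an assumed value):

```cpp
#include <algorithm>
#include <cassert>

// Soft-particle fade: alpha shrinks to zero as the sprite gets close to
// the geometry behind it. sceneDepth and particleDepth are distances
// from the camera; fadeDistance (assumed here) controls how soft the
// transition is.
float SoftParticleFade(float sceneDepth, float particleDepth,
                       float fadeDistance = 0.25f)
{
    float separation = sceneDepth - particleDepth; // gap behind the sprite
    return std::clamp(separation / fadeDistance, 0.0f, 1.0f);
}
```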
I used meshes whenever I could to help add depth to the effect. For example, when a spider is killed I used a liquid-shaped static mesh along with some sprites to add volume. Or for an explosion, I had a six-plane mesh that I used the main explosion flipbook on, and it worked pretty well to add some depth.
Another useful practice was adding as many small GPU particles as I could. They feel great in VR and help add depth to the effect. These can be sparks, blood drops, small rocks, snow, etc. It feels noisy on a monitor but for some reason it feels awesome in VR.
And as Partikel says, the camera-facing rotation of sprites can look really bad in VR. In Unreal, the best settings I found to solve this were to use PSA Facing Camera Position and check on Remove HMD Roll, which they just added.
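The "remove roll" part amounts to building the sprite's basis from world up instead of the HMD's up vector, so tilting your head never tilts the particles. A rough sketch of that construction (my own illustration, not Unreal's internal code):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 Cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y,
            a.z * b.x - a.x * b.z,
            a.x * b.y - a.y * b.x};
}
static Vec3 Normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Aim the sprite at the camera's position, but derive its up axis from
// world up rather than the camera's up vector, so HMD roll is ignored.
// (Degenerates when looking straight up/down; real code would handle that.)
void FacingCameraPositionNoRoll(Vec3 spritePos, Vec3 cameraPos,
                                Vec3& outRight, Vec3& outUp, Vec3& outForward)
{
    const Vec3 worldUp = {0.0f, 0.0f, 1.0f}; // Unreal is +Z up
    outForward = Normalize(Sub(cameraPos, spritePos));
    outRight   = Normalize(Cross(worldUp, outForward));
    outUp      = Cross(outForward, outRight);
}
```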
PSVR had a ton of hardware limitations, so I had to be very careful with overdraw, memory usage, render targets (refraction), CPU tick on particle systems, etc. Performance is a huge problem in VR, as you can imagine. The number one thing I can recommend is LODing your particle systems so that when they go off near you, you reduce the hell out of the overdraw. Overdraw will destroy your framerate in VR. I usually had 4 LODs per explosion: in the nearest, I would turn off any large sprites. In the second, I would use a mesh for the main sprite to add depth. The third LOD would be mostly sprites, since it's further away. And maybe a fourth for very far away, to get rid of small stuff like sparks. This thought process was applied to every particle emitter in the game.
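Picking the active LOD is just a distance check against a few thresholds. A tiny sketch of the four-LOD scheme above (the distances are made up for illustration, not Farpoint's numbers):

```cpp
#include <cassert>
#include <vector>

// Select a particle LOD index from camera distance, mirroring the
// explosion setup described above. Thresholds are hypothetical meters.
//   LOD 0 (closest): large sprites disabled, overdraw cut hard
//   LOD 1: mesh carries the main flipbook for depth
//   LOD 2: mostly sprites, it's far enough away
//   LOD 3 (farthest): small stuff like sparks culled
int SelectParticleLOD(float distToCamera,
                      const std::vector<float>& thresholds = {3.0f, 10.0f, 30.0f})
{
    int lod = 0;
    for (float t : thresholds)
        if (distToCamera >= t)
            ++lod;
    return lod;
}
```

Unreal's built-in particle LOD distance settings do this for you; the point is to actually author per-LOD emitter changes, not just let everything scale down.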
This is just a quick rundown but feel free to ask me any questions.