While I am encouraged by the use cases planned for V2X/V2V and the huge amount of thought going into data integrity and bandwidth, I am concerned that the very demanding real-time nature of this domain may not be fully appreciated. The Commercial IT world usually has a far less demanding notion of what "real-time" means. For example, the 10 Hz rate being discussed is far too slow. I believe there may be an issue where this massive asynchronous system has to adapt to what should be treated as synchronous systems, with significant timing and critical-latency issues. Certain things have to happen at certain times, in a certain order, and at a certain rate. Folks in aircraft simulation have been dealing with this for a long time, and as far as I know most Commercial IT folks don't understand the architecture required. They rely on raw CPU and network speeds, which isn't enough. As a matter of fact, no WAN may be suitable for the more critically demanding pieces of this (especially if satellites are used anywhere in the system).
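To put the update-rate concern in concrete terms, here is a rough sanity check of how far a highway-speed vehicle travels between beacons at various rates. The 10 Hz figure is the one under discussion; the other rates are purely hypothetical comparison points.

```python
# Distance a 75 mph vehicle covers between updates at various beacon rates.
# 10 Hz is the rate under discussion; the others are hypothetical comparisons.

FT_PER_S = 75 * 5280 / 3600  # 75 mph expressed in feet per second (110 ft/s)

for hz in (10, 20, 50, 100):
    gap_ft = FT_PER_S / hz
    print(f"{hz:>3} Hz -> {gap_ft:5.1f} ft traveled between updates")
```

At 10 Hz a single car moves 11 ft between updates; with two cars closing on each other, that is 22 ft of separation consumed per beacon interval before any processing even starts.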
Is the architecture actually "real-time"? As in, either zero data needs re-transmitting, or the system is robust enough to re-transmit lost data fast enough that no one misses it? A car going 75 mph covers 110 ft (about 36 yards) per second. Lots can happen in that time, and if two cars are closing on each other, that rate is effectively doubled. Now suppose several vehicles and other devices are involved. Some detect something, but their data is in conflict. It takes time for that group of devices to ascertain the truth and then act on it. There is a chain-reactive loop here (which needs to include the electro-mechanical delay in steering and braking). Those system-wide round trips of communication have to happen fast enough that no flawed or stale data gets acted on, resulting in the wrong thing happening.
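The chain-reactive loop above can be sketched as a back-of-the-envelope latency budget. Every delay value below is an illustrative assumption, not a measured number; the point is only that sensing, multi-device consensus, and electro-mechanical actuation all eat into the same shrinking gap.

```python
# Rough latency budget for one detect-agree-act cycle between closing cars.
# All delay values are illustrative assumptions, not measurements.

MPH_TO_FPS = 5280 / 3600  # feet per second per mph

def gap_consumed_ft(speed_mph: float, latency_s: float, closing: bool = True) -> float:
    """Feet of separation consumed while the system is still deciding."""
    rate_fps = speed_mph * MPH_TO_FPS * (2 if closing else 1)
    return rate_fps * latency_s

# Assumed stage delays (seconds):
sense_s     = 0.100  # 10 Hz beacon interval: data can be a full tick old
consensus_s = 0.200  # round trips for conflicting devices to agree on truth
actuate_s   = 0.300  # electro-mechanical lag in steering and braking

total_s = sense_s + consensus_s + actuate_s
eaten = gap_consumed_ft(75, total_s)  # two cars each doing 75 mph, closing
print(f"{total_s * 1000:.0f} ms budget -> {eaten:.0f} ft of gap consumed")
```

Under these assumed numbers, a 600 ms detect-agree-act cycle consumes 132 ft of separation between two cars closing at 75 mph each, before any retransmission of lost packets is even accounted for.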