The most obvious conclusion about the general character of MBone deployment is that areas with fewer network resources and fewer redundant links tend to have more efficient tunnel placements. The European tunnel structure shown in Figure 4 is much closer to a hierarchical distribution tree than that of the United States.
The recent privatization of the Internet has led to the fragmentation of the former hierarchical US structure. Figure 7 classifies tunnels by domain status: whether one, both, or neither endpoint belongs to a major Internet Service Provider. Note that most, but certainly not all, of the cross-country links have one or both ends on a backbone. In Figure 6 we further categorize the tunnels for which at least one endpoint host belongs to one of the major backbones. We see that although there are many tunnels between the East and West Coasts, each provider, taken alone, has an appropriate amount of redundancy. Nevertheless, from the broader view, the large number of tunnels all carrying the same information between the coasts is not an optimal use of limited Internet resources.
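The endpoint classification described above can be sketched in a few lines of code. The provider domain list, hostnames, and helper names here are illustrative assumptions, not data or code from this work:

```python
# Hypothetical list of major-backbone domains; the real set would come
# from the provider data used to build the figures.
BACKBONE_DOMAINS = {"sprintlink.net", "mci.net", "ans.net"}

def classify_tunnel(endpoint_a, endpoint_b):
    """Return 0, 1, or 2: how many tunnel endpoints lie in a backbone domain."""
    def on_backbone(host):
        return any(host.endswith(dom) for dom in BACKBONE_DOMAINS)
    return int(on_backbone(endpoint_a)) + int(on_backbone(endpoint_b))

# Illustrative tunnels covering the three categories in the classification.
tunnels = [
    ("mbone.tamu.edu", "dc.sprintlink.net"),        # one endpoint on a backbone
    ("sl-fw.sprintlink.net", "dc.sprintlink.net"),  # both endpoints on a backbone
    ("host.univ-a.edu", "host.univ-b.edu"),         # neither endpoint
]

for a, b in tunnels:
    print(a, b, "->", classify_tunnel(a, b))
```

Each tunnel then maps to one of the three visual categories (no, one, or both endpoints on a major provider).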
All figures thus far have shown the MBone on the same day. Figure 5 compares changes in the MBone across a four-month period. In both figures we highlight the Sprintlink tunnels and focus on Texas. We see that in February Texas A&M University had configured a tunnel to a Sprint hub in Washington DC. By June Sprint had extended its tunnel support to a major hub in Fort Worth, but the university had not leveraged the new topology and still used the now suboptimal Texas-to-DC tunnel. Such configurations are much easier to see with the geographic visualization than with the raw text data.
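The kind of stale configuration noted above could also be flagged programmatically by comparing each tunnel's length against the distance to the nearest provider hub. A minimal sketch, using great-circle distance and rough, hypothetical coordinates; this is an assumption for illustration, not a method from this work:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

# Approximate, illustrative coordinates for the endpoint and two hubs.
college_station = (30.6, -96.3)   # Texas A&M, roughly
hubs = {
    "DC": (38.9, -77.0),          # Washington DC hub
    "Fort Worth": (32.8, -97.3),  # Fort Worth hub added later
}

def nearest_hub(endpoint, hubs):
    """Return (name, distance_km) of the hub closest to an endpoint."""
    return min(
        ((name, haversine_km(*endpoint, *pos)) for name, pos in hubs.items()),
        key=lambda t: t[1],
    )

name, dist = nearest_hub(college_station, hubs)
print(name, round(dist))
```

Once the Fort Worth hub appears in the hub list, the check immediately shows that the existing Texas-to-DC tunnel is far longer than the distance to the nearest available hub.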